Apr 23 16:35:20.026085 ip-10-0-135-57 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 16:35:20.490921 ip-10-0-135-57 kubenswrapper[2562]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:20.490921 ip-10-0-135-57 kubenswrapper[2562]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 16:35:20.490921 ip-10-0-135-57 kubenswrapper[2562]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:35:20.490921 ip-10-0-135-57 kubenswrapper[2562]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 16:35:20.490921 ip-10-0-135-57 kubenswrapper[2562]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
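The deprecation warnings above all point the same way: these command-line flags should instead be expressed as fields in the KubeletConfiguration file passed via --config (here /etc/kubernetes/kubelet.conf, per the FLAG dump below). A minimal sketch of the equivalent config-file fields follows; the field names are from the upstream kubelet.config.k8s.io/v1beta1 API, while the values shown are illustrative assumptions, not taken from this node (except the CRI-O socket, which appears in the flag dump):

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir (path is an assumed example)
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved (values are assumed examples)
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# replaces --minimum-container-ttl-duration, per the warning's advice
evictionHard:
  memory.available: "100Mi"
```

Note that --pod-infra-container-image has no config-file equivalent; as the server.go message below states, the sandbox image should also be set in the remote runtime (for CRI-O, pause_image in crio.conf).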
Apr 23 16:35:20.492699 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.492608 2562 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 16:35:20.495728 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495711 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:20.495728 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495729 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495734 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495751 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495754 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495758 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495761 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495764 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495767 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495770 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495779 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495782 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495785 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495787 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495790 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495792 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495795 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495797 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495800 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495803 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:20.495803 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495806 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495809 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495812 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495815 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495818 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495820 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495823 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495825 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495828 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495830 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495833 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495836 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495838 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495841 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495844 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495846 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495849 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495851 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495853 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495856 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:20.496291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495858 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495861 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495863 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495865 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495868 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495870 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495873 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495875 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495878 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495880 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495884 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495886 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495889 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495891 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495895 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495898 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495901 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495903 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495906 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495909 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:20.496790 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495911 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495914 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495916 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495919 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495922 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495924 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495926 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495930 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495932 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495935 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495937 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495942 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495945 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495948 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495951 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495954 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495956 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495959 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495961 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:20.497271 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495963 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495966 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495968 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495971 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495974 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495976 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.495979 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496356 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496362 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496365 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496368 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496370 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496373 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496376 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496378 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496381 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496383 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496386 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496389 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:20.497722 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496391 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496394 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496398 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496402 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496405 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496408 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496410 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496413 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496416 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496419 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496421 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496424 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496427 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496429 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496432 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496435 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496437 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496440 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496443 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:20.498193 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496445 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496449 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496452 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496454 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496457 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496459 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496462 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496464 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496467 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496469 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496472 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496474 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496477 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496479 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496482 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496485 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496487 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496490 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496492 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496495 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:20.498694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496497 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496499 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496502 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496505 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496507 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496509 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496512 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496514 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496517 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496520 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496522 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496525 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496527 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496530 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496532 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496535 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496537 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496540 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496542 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496545 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:20.499210 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496548 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496550 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496553 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496556 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496559 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496561 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496564 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496568 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496572 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496574 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496577 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496580 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496582 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496586 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.496588 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496665 2562 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496673 2562 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496679 2562 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496684 2562 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496689 2562 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496696 2562 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 16:35:20.499692 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496701 2562 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496706 2562 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496709 2562 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496712 2562 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496716 2562 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496720 2562 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496723 2562 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496726 2562 flags.go:64] FLAG: --cgroup-root=""
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496729 2562 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496732 2562 flags.go:64] FLAG: --client-ca-file=""
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496735 2562 flags.go:64] FLAG: --cloud-config=""
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496752 2562 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496756 2562 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496761 2562 flags.go:64] FLAG: --cluster-domain=""
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496763 2562 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496766 2562 flags.go:64] FLAG: --config-dir=""
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496769 2562 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496773 2562 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496777 2562 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496780 2562 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496784 2562 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496787 2562 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496791 2562 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496794 2562 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 16:35:20.500286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496797 2562 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496800 2562 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496803 2562 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496808 2562 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496811 2562 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496814 2562 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496817 2562 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496822 2562 flags.go:64] FLAG: --enable-server="true"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496825 2562 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496830 2562 flags.go:64] FLAG: --event-burst="100"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496833 2562 flags.go:64] FLAG: --event-qps="50"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496836 2562 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496839 2562 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496842 2562 flags.go:64] FLAG: --eviction-hard=""
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496846 2562 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496849 2562 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496852 2562 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496855 2562 flags.go:64] FLAG: --eviction-soft=""
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496858 2562 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496861 2562 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496865 2562 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496868 2562 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496871 2562 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496874 2562 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496877 2562 flags.go:64] FLAG: --feature-gates=""
Apr 23 16:35:20.500881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496881 2562 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496884 2562 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496887 2562 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496891 2562 flags.go:64] FLAG:
--healthz-bind-address="127.0.0.1" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496894 2562 flags.go:64] FLAG: --healthz-port="10248" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496899 2562 flags.go:64] FLAG: --help="false" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496902 2562 flags.go:64] FLAG: --hostname-override="ip-10-0-135-57.ec2.internal" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496905 2562 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496908 2562 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496911 2562 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496915 2562 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496918 2562 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496921 2562 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496924 2562 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496928 2562 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496931 2562 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496934 2562 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 16:35:20.501481 ip-10-0-135-57 
kubenswrapper[2562]: I0423 16:35:20.496937 2562 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496941 2562 flags.go:64] FLAG: --kube-reserved="" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496944 2562 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496946 2562 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496949 2562 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496952 2562 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496955 2562 flags.go:64] FLAG: --lock-file="" Apr 23 16:35:20.501481 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496959 2562 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496962 2562 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496964 2562 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496969 2562 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496972 2562 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496975 2562 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496978 2562 flags.go:64] FLAG: --logging-format="text" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496981 2562 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 
16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496984 2562 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496987 2562 flags.go:64] FLAG: --manifest-url="" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496990 2562 flags.go:64] FLAG: --manifest-url-header="" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496995 2562 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.496998 2562 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497008 2562 flags.go:64] FLAG: --max-pods="110" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497011 2562 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497014 2562 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497017 2562 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497020 2562 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497023 2562 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497026 2562 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497029 2562 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497036 2562 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 
16:35:20.497040 2562 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497043 2562 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 16:35:20.502074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497047 2562 flags.go:64] FLAG: --pod-cidr="" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497050 2562 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497056 2562 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497059 2562 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497062 2562 flags.go:64] FLAG: --pods-per-core="0" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497065 2562 flags.go:64] FLAG: --port="10250" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497068 2562 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497071 2562 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ecfbce45ceb21167" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497074 2562 flags.go:64] FLAG: --qos-reserved="" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497077 2562 flags.go:64] FLAG: --read-only-port="10255" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497080 2562 flags.go:64] FLAG: --register-node="true" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497083 2562 flags.go:64] FLAG: --register-schedulable="true" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497086 2562 flags.go:64] FLAG: 
--register-with-taints="" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497090 2562 flags.go:64] FLAG: --registry-burst="10" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497093 2562 flags.go:64] FLAG: --registry-qps="5" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497096 2562 flags.go:64] FLAG: --reserved-cpus="" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497098 2562 flags.go:64] FLAG: --reserved-memory="" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497102 2562 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497105 2562 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497108 2562 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497111 2562 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497115 2562 flags.go:64] FLAG: --runonce="false" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497118 2562 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497122 2562 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497125 2562 flags.go:64] FLAG: --seccomp-default="false" Apr 23 16:35:20.502661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497128 2562 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497131 2562 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497134 2562 
flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497137 2562 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497140 2562 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497143 2562 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497147 2562 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497150 2562 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497153 2562 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497156 2562 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497159 2562 flags.go:64] FLAG: --system-cgroups="" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497162 2562 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497167 2562 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497170 2562 flags.go:64] FLAG: --tls-cert-file="" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497173 2562 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497178 2562 flags.go:64] FLAG: --tls-min-version="" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497181 2562 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 16:35:20.503261 ip-10-0-135-57 
kubenswrapper[2562]: I0423 16:35:20.497183 2562 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497186 2562 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497189 2562 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497192 2562 flags.go:64] FLAG: --v="2" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497196 2562 flags.go:64] FLAG: --version="false" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497200 2562 flags.go:64] FLAG: --vmodule="" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497205 2562 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 16:35:20.503261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497208 2562 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497302 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497308 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497311 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497316 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497319 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497322 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497325 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497328 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497331 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497334 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497336 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497339 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497342 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497346 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497349 2562 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497352 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497354 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497357 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:20.503849 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497360 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497363 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497365 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497368 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497370 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497373 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497375 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497378 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497381 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497383 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497386 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497390 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497393 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497395 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497398 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497400 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497403 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497406 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497409 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497412 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:35:20.504675 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497415 2562 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497417 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497420 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497423 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497426 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497430 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497432 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497436 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497439 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497441 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497444 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497447 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497449 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497452 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497454 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497457 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497460 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497462 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497465 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:20.505291 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497468 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497470 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497473 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497475 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497478 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497481 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497483 2562 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497485 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497488 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497490 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497494 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497497 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497499 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497502 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497505 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497507 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497510 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497512 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497515 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497517 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:20.505787 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497521 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:35:20.506277 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497524 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:35:20.506277 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497527 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:20.506277 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497529 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:35:20.506277 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497532 2562 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:35:20.506277 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497534 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:20.506277 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497537 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:35:20.506277 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497540 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:35:20.506277 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.497542 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:35:20.506277 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.497551 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.506374 2562 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.506393 2562 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506445 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506451 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506455 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506458 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506462 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506465 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506468 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506471 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506474 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506476 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506479 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506482 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506484 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506487 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506489 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506492 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:35:20.506511 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506495 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506497 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506500 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506502 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506506 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506508 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506511 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506515 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506518 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506520 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506523 2562 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506526 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506528 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506531 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506534 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506537 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506540 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506544 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506546 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506549 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:35:20.506998 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506551 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506554 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506556 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506559 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506561 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506564 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506566 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506569 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506571 2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506574 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 
16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506576 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506579 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506581 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506585 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506588 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506591 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506594 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506597 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506599 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506603 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 16:35:20.507486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506607 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506610 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506613 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506615 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506618 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506621 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506623 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506626 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506629 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506631 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506634 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506636 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506639 2562 feature_gate.go:328] 
unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506642 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506644 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506647 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506649 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506652 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506655 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506657 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:35:20.508052 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506660 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:35:20.508563 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506663 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:35:20.508563 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506666 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:35:20.508563 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506668 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:35:20.508563 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506671 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:35:20.508563 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506674 2562 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:35:20.508563 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506677 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:35:20.508563 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506681 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:35:20.508563 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506683 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:35:20.508563 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506686 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:35:20.508563 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.506691 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 16:35:20.508563 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506804 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:35:20.508563 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506810 2562 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:35:20.508563 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506813 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:35:20.508563 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506816 2562 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 16:35:20.508563 ip-10-0-135-57 
kubenswrapper[2562]: W0423 16:35:20.506819 2562 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506821 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506824 2562 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506827 2562 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506830 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506832 2562 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506835 2562 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506837 2562 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506840 2562 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506843 2562 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506845 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506848 2562 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506850 2562 feature_gate.go:328] unrecognized feature gate: 
ClusterMonitoringConfig Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506853 2562 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506855 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506858 2562 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506861 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506863 2562 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506867 2562 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506870 2562 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:35:20.508959 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506872 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506875 2562 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506878 2562 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506881 2562 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506884 2562 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506886 
2562 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506889 2562 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506891 2562 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506894 2562 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506896 2562 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506899 2562 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506901 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506904 2562 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506907 2562 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506909 2562 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506912 2562 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506914 2562 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506917 2562 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 
16:35:20.506920 2562 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506924 2562 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:35:20.509448 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506928 2562 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506931 2562 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506933 2562 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506936 2562 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506939 2562 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506941 2562 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506943 2562 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506946 2562 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506948 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506951 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506954 2562 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:35:20.509950 ip-10-0-135-57 
kubenswrapper[2562]: W0423 16:35:20.506957 2562 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506960 2562 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506962 2562 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506965 2562 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506968 2562 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506970 2562 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506973 2562 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506975 2562 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506978 2562 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:35:20.509950 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506980 2562 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506983 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506985 2562 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506988 2562 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerificationPKI Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506990 2562 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506993 2562 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506996 2562 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.506998 2562 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.507001 2562 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.507003 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.507006 2562 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.507009 2562 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.507012 2562 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.507015 2562 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.507017 2562 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.507020 2562 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:35:20.510432 ip-10-0-135-57 
kubenswrapper[2562]: W0423 16:35:20.507023 2562 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.507025 2562 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.507029 2562 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 23 16:35:20.510432 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.507032 2562 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:35:20.510897 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.507036 2562 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 16:35:20.510897 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:20.507038 2562 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:35:20.510897 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.507044 2562 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 16:35:20.510897 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.507825 2562 server.go:962] "Client rotation is on, will bootstrap in background" Apr 23 16:35:20.510897 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.510055 2562 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 23 16:35:20.511159 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.511146 2562 
server.go:1019] "Starting client certificate rotation" Apr 23 16:35:20.511264 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.511248 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 16:35:20.511302 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.511284 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 23 16:35:20.537135 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.537112 2562 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 16:35:20.541924 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.541907 2562 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 23 16:35:20.558527 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.558501 2562 log.go:25] "Validated CRI v1 runtime API" Apr 23 16:35:20.566451 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.566432 2562 log.go:25] "Validated CRI v1 image API" Apr 23 16:35:20.567616 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.567591 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 16:35:20.567724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.567701 2562 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 23 16:35:20.571564 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.571543 2562 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 de6f4302-0232-471d-9a78-7c8e0af981c2:/dev/nvme0n1p4 e561129e-08a6-436e-b144-8e46b0ebd5a0:/dev/nvme0n1p3] Apr 23 16:35:20.571615 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.571564 2562 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 
fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 23 16:35:20.578448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.578326 2562 manager.go:217] Machine: {Timestamp:2026-04-23 16:35:20.577041307 +0000 UTC m=+0.429856362 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101027 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d8bc9f2cbdba9a1f4c8c71e2b52a2 SystemUUID:ec2d8bc9-f2cb-dba9-a1f4-c8c71e2b52a2 BootID:0e2be542-c294-4bdd-aea3-87d118c43c49 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a1:f2:69:ad:19 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a1:f2:69:ad:19 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2a:bd:9b:76:4b:1d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 
NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 23 16:35:20.578448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.578434 2562 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 23 16:35:20.578592 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.578520 2562 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 16:35:20.580056 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.580033 2562 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 16:35:20.580205 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.580058 2562 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-57.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 16:35:20.580249 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.580217 2562 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 16:35:20.580249 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.580226 2562 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 16:35:20.580249 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.580239 2562 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 16:35:20.580328 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.580250 2562 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 16:35:20.581663 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.581652 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 16:35:20.581825 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.581815 2562 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 16:35:20.584130 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.584111 2562 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 16:35:20.584130 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.584135 2562 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 16:35:20.584202 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.584147 2562 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 16:35:20.584202 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.584156 2562 kubelet.go:397] "Adding apiserver pod source"
Apr 23 16:35:20.584202 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.584165 2562 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 16:35:20.585361 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.585348 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 16:35:20.585412 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.585372 2562 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 16:35:20.588833 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.588814 2562 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 16:35:20.590341 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.590320 2562 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 23 16:35:20.591564 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.591539 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jpztg"
Apr 23 16:35:20.592227 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.592212 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 23 16:35:20.592270 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.592232 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 23 16:35:20.592270 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.592239 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 23 16:35:20.592270 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.592245 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 23 16:35:20.592270 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.592251 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 23 16:35:20.592270 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.592257 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 23 16:35:20.592270 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.592263 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 23 16:35:20.592270 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.592268 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 23 16:35:20.592456 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.592275 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 23 16:35:20.592456 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.592282 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 23 16:35:20.592456 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.592291 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 23 16:35:20.592456 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.592300 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 23 16:35:20.594016 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.594003 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 23 16:35:20.594056 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.594019 2562 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 23 16:35:20.596903 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:20.596869 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 16:35:20.596982 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:20.596931 2562 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-57.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 16:35:20.597979 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.597963 2562 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 23 16:35:20.598061 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.598002 2562 server.go:1295] "Started kubelet"
Apr 23 16:35:20.598158 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.598127 2562 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 23 16:35:20.598376 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.598277 2562 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 23 16:35:20.598461 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.598450 2562 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 23 16:35:20.599075 ip-10-0-135-57 systemd[1]: Started Kubernetes Kubelet.
Apr 23 16:35:20.599705 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.599680 2562 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 16:35:20.600771 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.600735 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-jpztg"
Apr 23 16:35:20.601464 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.601447 2562 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 16:35:20.606666 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.606644 2562 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 16:35:20.607231 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.607214 2562 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 16:35:20.607354 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:20.607324 2562 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 16:35:20.607989 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.607974 2562 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 16:35:20.608079 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.607978 2562 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 16:35:20.608079 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.608023 2562 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 16:35:20.608196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.608091 2562 factory.go:55] Registering systemd factory
Apr 23 16:35:20.608196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.608134 2562 factory.go:223] Registration of the systemd container factory successfully
Apr 23 16:35:20.608196 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:20.608154 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-57.ec2.internal\" not found"
Apr 23 16:35:20.608389 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.608375 2562 factory.go:153] Registering CRI-O factory
Apr 23 16:35:20.608429 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.608392 2562 factory.go:223] Registration of the crio container factory successfully
Apr 23 16:35:20.608459 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.608446 2562 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 16:35:20.608491 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.608482 2562 factory.go:103] Registering Raw factory
Apr 23 16:35:20.608524 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.608501 2562 manager.go:1196] Started watching for new ooms in manager
Apr 23 16:35:20.608774 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.608761 2562 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 16:35:20.608844 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.608778 2562 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 16:35:20.609190 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.609140 2562 manager.go:319] Starting recovery of all containers
Apr 23 16:35:20.610503 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.610480 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:20.613826 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.613786 2562 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-57.ec2.internal" not found
Apr 23 16:35:20.613913 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:20.613819 2562 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-135-57.ec2.internal\" not found" node="ip-10-0-135-57.ec2.internal"
Apr 23 16:35:20.619464 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.619229 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 16:35:20.621387 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.621372 2562 manager.go:324] Recovery completed
Apr 23 16:35:20.625784 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.625768 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:20.629640 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.629625 2562 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-57.ec2.internal" not found
Apr 23 16:35:20.631726 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.631712 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:20.631826 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.631785 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:20.631826 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.631806 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:20.632330 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.632315 2562 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 16:35:20.632330 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.632329 2562 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 16:35:20.632429 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.632347 2562 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 16:35:20.634878 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.634865 2562 policy_none.go:49] "None policy: Start"
Apr 23 16:35:20.634931 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.634882 2562 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 16:35:20.634931 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.634893 2562 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 16:35:20.676995 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.674350 2562 manager.go:341] "Starting Device Plugin manager"
Apr 23 16:35:20.676995 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:20.674386 2562 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 16:35:20.676995 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.674397 2562 server.go:85] "Starting device plugin registration server"
Apr 23 16:35:20.676995 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.674612 2562 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 16:35:20.676995 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.674622 2562 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 16:35:20.676995 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.674703 2562 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 16:35:20.676995 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.674788 2562 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 16:35:20.676995 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.674795 2562 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 16:35:20.676995 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:20.675389 2562 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 16:35:20.676995 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:20.675422 2562 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-57.ec2.internal\" not found"
Apr 23 16:35:20.686511 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.686494 2562 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-135-57.ec2.internal" not found
Apr 23 16:35:20.737232 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.737202 2562 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 16:35:20.737232 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.737234 2562 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 16:35:20.737425 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.737252 2562 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 16:35:20.737425 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.737261 2562 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 16:35:20.737425 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:20.737290 2562 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 16:35:20.739797 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.739776 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:20.775705 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.775645 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:20.776588 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.776573 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:20.776660 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.776604 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:20.776660 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.776613 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:20.776660 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.776637 2562 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-57.ec2.internal"
Apr 23 16:35:20.785552 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.785538 2562 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-57.ec2.internal"
Apr 23 16:35:20.785610 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:20.785559 2562 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-57.ec2.internal\": node \"ip-10-0-135-57.ec2.internal\" not found"
Apr 23 16:35:20.804486 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:20.804463 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-57.ec2.internal\" not found"
Apr 23 16:35:20.837863 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.837826 2562 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-135-57.ec2.internal"]
Apr 23 16:35:20.837937 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.837910 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:20.839487 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.839463 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:20.839556 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.839498 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:20.839556 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.839512 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:20.840792 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.840777 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:20.840944 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.840929 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal"
Apr 23 16:35:20.841004 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.840967 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:20.842585 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.842569 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:20.842689 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.842590 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:20.842689 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.842601 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:20.842689 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.842608 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:20.842689 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.842628 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:20.842689 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.842638 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:20.843735 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.843722 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-57.ec2.internal"
Apr 23 16:35:20.843795 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.843763 2562 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 16:35:20.844493 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.844480 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 16:35:20.844551 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.844507 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 16:35:20.844551 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.844520 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeHasSufficientPID"
Apr 23 16:35:20.874458 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:20.874429 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-57.ec2.internal\" not found" node="ip-10-0-135-57.ec2.internal"
Apr 23 16:35:20.878714 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:20.878696 2562 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-57.ec2.internal\" not found" node="ip-10-0-135-57.ec2.internal"
Apr 23 16:35:20.905025 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:20.905005 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-57.ec2.internal\" not found"
Apr 23 16:35:20.911375 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.911352 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6bfe44dce378e2f142aa223226db1983-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal\" (UID: \"6bfe44dce378e2f142aa223226db1983\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal"
Apr 23 16:35:20.911464 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.911393 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6bfe44dce378e2f142aa223226db1983-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal\" (UID: \"6bfe44dce378e2f142aa223226db1983\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal"
Apr 23 16:35:20.911464 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:20.911421 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cb72edff024abc1bfc776fb556b861df-config\") pod \"kube-apiserver-proxy-ip-10-0-135-57.ec2.internal\" (UID: \"cb72edff024abc1bfc776fb556b861df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-57.ec2.internal"
Apr 23 16:35:21.005887 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:21.005854 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-57.ec2.internal\" not found"
Apr 23 16:35:21.012238 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.012218 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6bfe44dce378e2f142aa223226db1983-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal\" (UID: \"6bfe44dce378e2f142aa223226db1983\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal"
Apr 23 16:35:21.012313 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.012218 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6bfe44dce378e2f142aa223226db1983-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal\" (UID: \"6bfe44dce378e2f142aa223226db1983\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal"
Apr 23 16:35:21.012313 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.012291 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cb72edff024abc1bfc776fb556b861df-config\") pod \"kube-apiserver-proxy-ip-10-0-135-57.ec2.internal\" (UID: \"cb72edff024abc1bfc776fb556b861df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-57.ec2.internal"
Apr 23 16:35:21.012313 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.012272 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cb72edff024abc1bfc776fb556b861df-config\") pod \"kube-apiserver-proxy-ip-10-0-135-57.ec2.internal\" (UID: \"cb72edff024abc1bfc776fb556b861df\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-57.ec2.internal"
Apr 23 16:35:21.012405 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.012320 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6bfe44dce378e2f142aa223226db1983-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal\" (UID: \"6bfe44dce378e2f142aa223226db1983\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal"
Apr 23 16:35:21.012405 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.012347 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6bfe44dce378e2f142aa223226db1983-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal\" (UID: \"6bfe44dce378e2f142aa223226db1983\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal"
Apr 23 16:35:21.106486 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:21.106400 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-57.ec2.internal\" not found"
Apr 23 16:35:21.176933 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.176905 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal"
Apr 23 16:35:21.181608 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.181589 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-57.ec2.internal"
Apr 23 16:35:21.207483 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:21.207449 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-57.ec2.internal\" not found"
Apr 23 16:35:21.307926 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:21.307890 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-57.ec2.internal\" not found"
Apr 23 16:35:21.408355 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:21.408321 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-57.ec2.internal\" not found"
Apr 23 16:35:21.508801 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:21.508769 2562 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-57.ec2.internal\" not found"
Apr 23 16:35:21.510936 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.510919 2562 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 16:35:21.511079 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.511058 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 16:35:21.511112 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.511087 2562 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 16:35:21.547015 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.546990 2562 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:21.585327 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.585302 2562 apiserver.go:52] "Watching apiserver"
Apr 23 16:35:21.599651 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.599627 2562 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 16:35:21.600020 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.600000 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-h6w7v","openshift-network-operator/iptables-alerter-kjd4g","openshift-ovn-kubernetes/ovnkube-node-krt5k","openshift-cluster-node-tuning-operator/tuned-vllcp","openshift-image-registry/node-ca-h68c5","openshift-multus/multus-additional-cni-plugins-q6xdb","openshift-multus/network-metrics-daemon-6wbcq","openshift-network-diagnostics/network-check-target-wn6cd","kube-system/konnectivity-agent-86gqj","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"]
Apr 23 16:35:21.602486 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.602468 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h6w7v"
Apr 23 16:35:21.602725 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.602674 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kjd4g"
Apr 23 16:35:21.602936 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.602889 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 16:30:20 +0000 UTC" deadline="2028-01-29 06:17:03.489421504 +0000 UTC"
Apr 23 16:35:21.602936 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.602934 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15493h41m41.886491111s"
Apr 23 16:35:21.603705 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.603685 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k"
Apr 23 16:35:21.604854 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.604737 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vllcp"
Apr 23 16:35:21.605819 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.605805 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-h68c5"
Apr 23 16:35:21.606875 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.606855 2562 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 16:35:21.607034 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.607003 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 16:35:21.607114 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.607056 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q6xdb"
Apr 23 16:35:21.607171 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.607134 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 16:35:21.607247 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.607230 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-r7btr\""
Apr 23 16:35:21.607303 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.607267 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 16:35:21.607426 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.607410 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal"
Apr 23 16:35:21.607490 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.607425 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 16:35:21.607490 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.607428 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 16:35:21.607568 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.607526 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-vq8nn\""
Apr 23 16:35:21.607658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.607644 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 16:35:21.608083 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608064 2562 reflector.go:430] "Caches populated" type="*v1.Secret"
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qcf7r\"" Apr 23 16:35:21.608208 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608066 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 16:35:21.608208 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608064 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 16:35:21.608782 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608442 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 16:35:21.608782 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608467 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 16:35:21.608782 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608480 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 16:35:21.608782 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608513 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 16:35:21.608782 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608530 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-n6tl9\"" Apr 23 16:35:21.608782 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608467 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 16:35:21.608782 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608481 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 16:35:21.608782 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608479 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:21.608782 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608442 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:35:21.608782 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:21.608654 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039" Apr 23 16:35:21.608782 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608667 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 16:35:21.608782 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608719 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 16:35:21.609316 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608823 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-wl7qd\"" Apr 23 16:35:21.609316 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.608909 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 16:35:21.609974 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.609957 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:35:21.610070 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:21.610016 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7" Apr 23 16:35:21.610070 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.610045 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 16:35:21.610272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.610248 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 16:35:21.610856 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.610838 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-ccff8\"" Apr 23 16:35:21.611282 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.611266 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-86gqj" Apr 23 16:35:21.612566 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.612549 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr" Apr 23 16:35:21.613908 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.613889 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-qhg9q\"" Apr 23 16:35:21.613997 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.613987 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 16:35:21.614112 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.614095 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 16:35:21.614996 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.614977 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 16:35:21.615272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.615159 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 16:35:21.615272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.615246 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 16:35:21.615594 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.615579 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-xvnpc\"" Apr 23 16:35:21.615819 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.615801 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-run-multus-certs\") pod \"multus-h6w7v\" (UID: 
\"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.615914 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.615825 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-etc-kubernetes\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.615914 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.615840 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-log-socket\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.615914 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.615863 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-sysconfig\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.615914 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.615886 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs\") pod \"network-metrics-daemon-6wbcq\" (UID: \"adbb31b5-ee6b-431b-ac95-7775688ba039\") " pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:21.615914 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.615910 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-modprobe-d\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.616105 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.615968 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-etc-openvswitch\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.616105 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616011 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-env-overrides\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.616105 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616055 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-os-release\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.616105 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616084 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-hostroot\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.616266 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616107 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-systemd\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.616266 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616131 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-var-lib-cni-multus\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.616266 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616150 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/380d4b90-b7df-4855-ae70-0dc24d42f0d3-iptables-alerter-script\") pod \"iptables-alerter-kjd4g\" (UID: \"380d4b90-b7df-4855-ae70-0dc24d42f0d3\") " pod="openshift-network-operator/iptables-alerter-kjd4g" Apr 23 16:35:21.616266 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616170 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/380d4b90-b7df-4855-ae70-0dc24d42f0d3-host-slash\") pod \"iptables-alerter-kjd4g\" (UID: \"380d4b90-b7df-4855-ae70-0dc24d42f0d3\") " pod="openshift-network-operator/iptables-alerter-kjd4g" Apr 23 16:35:21.616266 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616191 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 
16:35:21.616266 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616239 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-tuned\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.616502 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616295 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b15c0f5c-7376-4c94-b44b-d6eb7a192048-tmp\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.616502 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616327 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-run-systemd\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.616502 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616354 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-cni-netd\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.616502 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616386 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-ovnkube-script-lib\") pod \"ovnkube-node-krt5k\" (UID: 
\"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.616502 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616412 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-sysctl-d\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.616502 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616433 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-run\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.616502 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616466 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86-host\") pod \"node-ca-h68c5\" (UID: \"e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86\") " pod="openshift-image-registry/node-ca-h68c5" Apr 23 16:35:21.616502 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616497 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-multus-cni-dir\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.616887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616521 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-var-lib-cni-bin\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.616887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616546 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-run-ovn\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.616887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616569 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-node-log\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.616887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616604 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2226bf67-86e3-4375-a378-075aedce2ea6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.616887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616631 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5kdj\" (UniqueName: \"kubernetes.io/projected/adbb31b5-ee6b-431b-ac95-7775688ba039-kube-api-access-n5kdj\") pod \"network-metrics-daemon-6wbcq\" (UID: \"adbb31b5-ee6b-431b-ac95-7775688ba039\") " pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:21.616887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616662 
2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5lhx\" (UniqueName: \"kubernetes.io/projected/b15c0f5c-7376-4c94-b44b-d6eb7a192048-kube-api-access-v5lhx\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.616887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616686 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-run-k8s-cni-cncf-io\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.616887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616711 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-run-netns\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.616887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616769 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-cni-binary-copy\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.616887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616799 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-run-netns\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" 
Apr 23 16:35:21.616887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616823 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2226bf67-86e3-4375-a378-075aedce2ea6-cnibin\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.616887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616861 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-cnibin\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.616887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616886 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-var-lib-kubelet\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.617448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616909 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-slash\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.617448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616934 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.617448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.616973 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2226bf67-86e3-4375-a378-075aedce2ea6-cni-binary-copy\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.617448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617002 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcb2c\" (UniqueName: \"kubernetes.io/projected/2226bf67-86e3-4375-a378-075aedce2ea6-kube-api-access-tcb2c\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.617448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617027 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-host\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.617448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617058 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-system-cni-dir\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.617448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617088 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-cni-bin\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.617448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617129 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-ovnkube-config\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.617448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617162 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-ovn-node-metrics-cert\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.617448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617201 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv2hz\" (UniqueName: \"kubernetes.io/projected/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-kube-api-access-kv2hz\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.617448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617242 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-lib-modules\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.617448 ip-10-0-135-57 
kubenswrapper[2562]: I0423 16:35:21.617266 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-var-lib-openvswitch\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.617448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617294 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2226bf67-86e3-4375-a378-075aedce2ea6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.617448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617320 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-kubernetes\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.617448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617342 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-sys\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.617448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617374 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-systemd-units\") pod 
\"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.618032 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617405 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2226bf67-86e3-4375-a378-075aedce2ea6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.618032 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617436 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llch5\" (UniqueName: \"kubernetes.io/projected/380d4b90-b7df-4855-ae70-0dc24d42f0d3-kube-api-access-llch5\") pod \"iptables-alerter-kjd4g\" (UID: \"380d4b90-b7df-4855-ae70-0dc24d42f0d3\") " pod="openshift-network-operator/iptables-alerter-kjd4g" Apr 23 16:35:21.618032 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617465 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-kubelet\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.618032 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617486 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86-serviceca\") pod \"node-ca-h68c5\" (UID: \"e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86\") " pod="openshift-image-registry/node-ca-h68c5" Apr 23 16:35:21.618032 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617517 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-multus-socket-dir-parent\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.618032 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617552 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l82z8\" (UniqueName: \"kubernetes.io/projected/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-kube-api-access-l82z8\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.618032 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617613 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2226bf67-86e3-4375-a378-075aedce2ea6-os-release\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.618032 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617647 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-var-lib-kubelet\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.618032 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617675 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-multus-conf-dir\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.618032 ip-10-0-135-57 kubenswrapper[2562]: 
I0423 16:35:21.617729 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-multus-daemon-config\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.618032 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617780 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-run-openvswitch\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.618032 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617807 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2226bf67-86e3-4375-a378-075aedce2ea6-system-cni-dir\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.618032 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617829 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-sysctl-conf\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.618032 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.617855 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqznn\" (UniqueName: \"kubernetes.io/projected/e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86-kube-api-access-cqznn\") pod \"node-ca-h68c5\" (UID: 
\"e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86\") " pod="openshift-image-registry/node-ca-h68c5" Apr 23 16:35:21.619551 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.619534 2562 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 16:35:21.621062 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.621044 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal"] Apr 23 16:35:21.621572 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.621556 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 16:35:21.621647 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.621635 2562 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-57.ec2.internal" Apr 23 16:35:21.629223 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.628406 2562 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 16:35:21.629223 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.628532 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-135-57.ec2.internal"] Apr 23 16:35:21.648659 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.648618 2562 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4bqpv" Apr 23 16:35:21.664526 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.664480 2562 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4bqpv" Apr 23 16:35:21.693294 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:21.693266 2562 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bfe44dce378e2f142aa223226db1983.slice/crio-99531fcae33d0320a9588d6611ec3f956ffaf52c7e07e8d976983332757b1a4e WatchSource:0}: Error finding container 99531fcae33d0320a9588d6611ec3f956ffaf52c7e07e8d976983332757b1a4e: Status 404 returned error can't find the container with id 99531fcae33d0320a9588d6611ec3f956ffaf52c7e07e8d976983332757b1a4e Apr 23 16:35:21.693645 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:21.693628 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb72edff024abc1bfc776fb556b861df.slice/crio-7ee18e1c1b9a0cbb87c8670228ca279db9820c101fb3cbe4b482c552047fc88c WatchSource:0}: Error finding container 7ee18e1c1b9a0cbb87c8670228ca279db9820c101fb3cbe4b482c552047fc88c: Status 404 returned error can't find the container with id 7ee18e1c1b9a0cbb87c8670228ca279db9820c101fb3cbe4b482c552047fc88c Apr 23 16:35:21.698567 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.698547 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:35:21.709479 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.709459 2562 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 16:35:21.718870 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.718848 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqznn\" (UniqueName: \"kubernetes.io/projected/e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86-kube-api-access-cqznn\") pod \"node-ca-h68c5\" (UID: \"e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86\") " pod="openshift-image-registry/node-ca-h68c5" Apr 23 16:35:21.718962 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.718879 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95594\" 
(UniqueName: \"kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594\") pod \"network-check-target-wn6cd\" (UID: \"0ee2ca50-34ff-4830-8c04-92018768a3a7\") " pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:35:21.718962 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.718898 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-run-multus-certs\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.718962 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.718914 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-etc-kubernetes\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.718962 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.718929 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-log-socket\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.718962 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.718944 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-sysconfig\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.718982 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs\") pod \"network-metrics-daemon-6wbcq\" (UID: \"adbb31b5-ee6b-431b-ac95-7775688ba039\") " pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.718990 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-sysconfig\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.718995 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-log-socket\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719012 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-modprobe-d\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719023 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-etc-kubernetes\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719038 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-etc-openvswitch\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719050 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-run-multus-certs\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719055 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-env-overrides\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719080 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-socket-dir\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr" Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:21.719084 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719105 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-os-release\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") 
" pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719122 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-modprobe-d\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719131 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-hostroot\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719133 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-etc-openvswitch\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719154 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-systemd\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719178 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-hostroot\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.719196 ip-10-0-135-57 
kubenswrapper[2562]: E0423 16:35:21.719192 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs podName:adbb31b5-ee6b-431b-ac95-7775688ba039 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:22.2191568 +0000 UTC m=+2.071971854 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs") pod "network-metrics-daemon-6wbcq" (UID: "adbb31b5-ee6b-431b-ac95-7775688ba039") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:21.719196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719190 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-os-release\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719210 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-systemd\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719251 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-var-lib-cni-multus\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719278 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" 
(UniqueName: \"kubernetes.io/configmap/380d4b90-b7df-4855-ae70-0dc24d42f0d3-iptables-alerter-script\") pod \"iptables-alerter-kjd4g\" (UID: \"380d4b90-b7df-4855-ae70-0dc24d42f0d3\") " pod="openshift-network-operator/iptables-alerter-kjd4g" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719306 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/380d4b90-b7df-4855-ae70-0dc24d42f0d3-host-slash\") pod \"iptables-alerter-kjd4g\" (UID: \"380d4b90-b7df-4855-ae70-0dc24d42f0d3\") " pod="openshift-network-operator/iptables-alerter-kjd4g" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719331 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719335 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-var-lib-cni-multus\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719367 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719375 2562 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/380d4b90-b7df-4855-ae70-0dc24d42f0d3-host-slash\") pod \"iptables-alerter-kjd4g\" (UID: \"380d4b90-b7df-4855-ae70-0dc24d42f0d3\") " pod="openshift-network-operator/iptables-alerter-kjd4g" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719394 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-tuned\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719412 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b15c0f5c-7376-4c94-b44b-d6eb7a192048-tmp\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719427 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-run-systemd\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719443 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-cni-netd\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719459 2562 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-ovnkube-script-lib\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719491 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-env-overrides\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719513 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-sysctl-d\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719516 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-cni-netd\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719494 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-run-systemd\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.719939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719541 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-run\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719564 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86-host\") pod \"node-ca-h68c5\" (UID: \"e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86\") " pod="openshift-image-registry/node-ca-h68c5" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719590 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-run\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719594 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-sys-fs\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719628 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86-host\") pod \"node-ca-h68c5\" (UID: \"e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86\") " pod="openshift-image-registry/node-ca-h68c5" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719641 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-sysctl-d\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719653 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsw6x\" (UniqueName: \"kubernetes.io/projected/3e1cd67e-5850-4c5d-b72f-849d619e4d11-kube-api-access-zsw6x\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719705 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-multus-cni-dir\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719728 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-var-lib-cni-bin\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719759 2562 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719768 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/380d4b90-b7df-4855-ae70-0dc24d42f0d3-iptables-alerter-script\") pod \"iptables-alerter-kjd4g\" (UID: \"380d4b90-b7df-4855-ae70-0dc24d42f0d3\") " pod="openshift-network-operator/iptables-alerter-kjd4g" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719775 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-run-ovn\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719785 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-var-lib-cni-bin\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719788 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-multus-cni-dir\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719798 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-node-log\") pod \"ovnkube-node-krt5k\" (UID: 
\"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719820 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-run-ovn\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719832 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-node-log\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719837 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2226bf67-86e3-4375-a378-075aedce2ea6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.720837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719862 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5kdj\" (UniqueName: \"kubernetes.io/projected/adbb31b5-ee6b-431b-ac95-7775688ba039-kube-api-access-n5kdj\") pod \"network-metrics-daemon-6wbcq\" (UID: \"adbb31b5-ee6b-431b-ac95-7775688ba039\") " pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719879 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5lhx\" (UniqueName: 
\"kubernetes.io/projected/b15c0f5c-7376-4c94-b44b-d6eb7a192048-kube-api-access-v5lhx\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719901 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.719962 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-ovnkube-script-lib\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720009 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-run-k8s-cni-cncf-io\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720028 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-run-netns\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720059 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-registration-dir\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720063 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-run-k8s-cni-cncf-io\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720082 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-device-dir\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720100 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-cni-binary-copy\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720116 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-run-netns\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.721658 ip-10-0-135-57 
kubenswrapper[2562]: I0423 16:35:21.720121 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-run-netns\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720153 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2226bf67-86e3-4375-a378-075aedce2ea6-cnibin\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720167 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-run-netns\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720183 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a58d25fc-7e16-47db-81f3-d8e27f59a92b-agent-certs\") pod \"konnectivity-agent-86gqj\" (UID: \"a58d25fc-7e16-47db-81f3-d8e27f59a92b\") " pod="kube-system/konnectivity-agent-86gqj" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720211 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-cnibin\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720214 
2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2226bf67-86e3-4375-a378-075aedce2ea6-cnibin\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.721658 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720236 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-var-lib-kubelet\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720263 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-slash\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720279 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-cnibin\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720289 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 
16:35:21.720315 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2226bf67-86e3-4375-a378-075aedce2ea6-cni-binary-copy\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720323 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-host-var-lib-kubelet\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720314 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2226bf67-86e3-4375-a378-075aedce2ea6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720350 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-slash\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720355 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tcb2c\" (UniqueName: \"kubernetes.io/projected/2226bf67-86e3-4375-a378-075aedce2ea6-kube-api-access-tcb2c\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " 
pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720400 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-host\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720350 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720427 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a58d25fc-7e16-47db-81f3-d8e27f59a92b-konnectivity-ca\") pod \"konnectivity-agent-86gqj\" (UID: \"a58d25fc-7e16-47db-81f3-d8e27f59a92b\") " pod="kube-system/konnectivity-agent-86gqj" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720452 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-system-cni-dir\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720473 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-cni-bin\") pod \"ovnkube-node-krt5k\" (UID: 
\"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720494 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-ovnkube-config\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720479 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-host\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720517 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-ovn-node-metrics-cert\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.722196 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720540 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kv2hz\" (UniqueName: \"kubernetes.io/projected/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-kube-api-access-kv2hz\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720560 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-lib-modules\") pod \"tuned-vllcp\" (UID: 
\"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720573 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-system-cni-dir\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720585 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-etc-selinux\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720590 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-cni-binary-copy\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720636 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-cni-bin\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720665 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-var-lib-openvswitch\") pod \"ovnkube-node-krt5k\" 
(UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720713 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2226bf67-86e3-4375-a378-075aedce2ea6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720724 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-lib-modules\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720769 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-var-lib-openvswitch\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720772 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-kubernetes\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720813 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-kubernetes\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720817 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-sys\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720843 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-systemd-units\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720872 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2226bf67-86e3-4375-a378-075aedce2ea6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720897 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-sys\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720901 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llch5\" (UniqueName: 
\"kubernetes.io/projected/380d4b90-b7df-4855-ae70-0dc24d42f0d3-kube-api-access-llch5\") pod \"iptables-alerter-kjd4g\" (UID: \"380d4b90-b7df-4855-ae70-0dc24d42f0d3\") " pod="openshift-network-operator/iptables-alerter-kjd4g" Apr 23 16:35:21.722724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720935 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-systemd-units\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720945 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-kubelet\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720967 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86-serviceca\") pod \"node-ca-h68c5\" (UID: \"e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86\") " pod="openshift-image-registry/node-ca-h68c5" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.720987 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-multus-socket-dir-parent\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721011 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l82z8\" (UniqueName: 
\"kubernetes.io/projected/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-kube-api-access-l82z8\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721029 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-ovnkube-config\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721045 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2226bf67-86e3-4375-a378-075aedce2ea6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721034 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2226bf67-86e3-4375-a378-075aedce2ea6-os-release\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721068 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-host-kubelet\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721082 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-var-lib-kubelet\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721109 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-multus-conf-dir\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721103 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2226bf67-86e3-4375-a378-075aedce2ea6-os-release\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721162 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-multus-daemon-config\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721177 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-var-lib-kubelet\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721190 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-run-openvswitch\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721203 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/2226bf67-86e3-4375-a378-075aedce2ea6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721210 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-multus-socket-dir-parent\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.723272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721222 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2226bf67-86e3-4375-a378-075aedce2ea6-system-cni-dir\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.723781 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721271 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2226bf67-86e3-4375-a378-075aedce2ea6-cni-binary-copy\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.723781 ip-10-0-135-57 kubenswrapper[2562]: 
I0423 16:35:21.721214 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-multus-conf-dir\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v" Apr 23 16:35:21.723781 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721484 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-run-openvswitch\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:21.723781 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721487 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86-serviceca\") pod \"node-ca-h68c5\" (UID: \"e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86\") " pod="openshift-image-registry/node-ca-h68c5" Apr 23 16:35:21.723781 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721512 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-sysctl-conf\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp" Apr 23 16:35:21.723781 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721540 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2226bf67-86e3-4375-a378-075aedce2ea6-system-cni-dir\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb" Apr 23 16:35:21.723781 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.721643 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-sysctl-conf\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp"
Apr 23 16:35:21.723781 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.722409 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-multus-daemon-config\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v"
Apr 23 16:35:21.723781 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.722989 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b15c0f5c-7376-4c94-b44b-d6eb7a192048-etc-tuned\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp"
Apr 23 16:35:21.723781 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.723046 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b15c0f5c-7376-4c94-b44b-d6eb7a192048-tmp\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp"
Apr 23 16:35:21.723781 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.723664 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-ovn-node-metrics-cert\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k"
Apr 23 16:35:21.734270 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.734250 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5lhx\" (UniqueName: \"kubernetes.io/projected/b15c0f5c-7376-4c94-b44b-d6eb7a192048-kube-api-access-v5lhx\") pod \"tuned-vllcp\" (UID: \"b15c0f5c-7376-4c94-b44b-d6eb7a192048\") " pod="openshift-cluster-node-tuning-operator/tuned-vllcp"
Apr 23 16:35:21.736824 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.736806 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5kdj\" (UniqueName: \"kubernetes.io/projected/adbb31b5-ee6b-431b-ac95-7775688ba039-kube-api-access-n5kdj\") pod \"network-metrics-daemon-6wbcq\" (UID: \"adbb31b5-ee6b-431b-ac95-7775688ba039\") " pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:35:21.736947 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.736925 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqznn\" (UniqueName: \"kubernetes.io/projected/e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86-kube-api-access-cqznn\") pod \"node-ca-h68c5\" (UID: \"e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86\") " pod="openshift-image-registry/node-ca-h68c5"
Apr 23 16:35:21.738969 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.738928 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv2hz\" (UniqueName: \"kubernetes.io/projected/fd41c55a-49c5-41cd-8fd5-f7964f5444e9-kube-api-access-kv2hz\") pod \"ovnkube-node-krt5k\" (UID: \"fd41c55a-49c5-41cd-8fd5-f7964f5444e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-krt5k"
Apr 23 16:35:21.739657 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.739635 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llch5\" (UniqueName: \"kubernetes.io/projected/380d4b90-b7df-4855-ae70-0dc24d42f0d3-kube-api-access-llch5\") pod \"iptables-alerter-kjd4g\" (UID: \"380d4b90-b7df-4855-ae70-0dc24d42f0d3\") " pod="openshift-network-operator/iptables-alerter-kjd4g"
Apr 23 16:35:21.739752 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.739680 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l82z8\" (UniqueName: \"kubernetes.io/projected/27c0f8c4-6f27-4267-8d2e-7e39aa9adcef-kube-api-access-l82z8\") pod \"multus-h6w7v\" (UID: \"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef\") " pod="openshift-multus/multus-h6w7v"
Apr 23 16:35:21.740380 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.740343 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-57.ec2.internal" event={"ID":"cb72edff024abc1bfc776fb556b861df","Type":"ContainerStarted","Data":"7ee18e1c1b9a0cbb87c8670228ca279db9820c101fb3cbe4b482c552047fc88c"}
Apr 23 16:35:21.740832 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.740812 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcb2c\" (UniqueName: \"kubernetes.io/projected/2226bf67-86e3-4375-a378-075aedce2ea6-kube-api-access-tcb2c\") pod \"multus-additional-cni-plugins-q6xdb\" (UID: \"2226bf67-86e3-4375-a378-075aedce2ea6\") " pod="openshift-multus/multus-additional-cni-plugins-q6xdb"
Apr 23 16:35:21.743509 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.742055 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal" event={"ID":"6bfe44dce378e2f142aa223226db1983","Type":"ContainerStarted","Data":"99531fcae33d0320a9588d6611ec3f956ffaf52c7e07e8d976983332757b1a4e"}
Apr 23 16:35:21.822027 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.821997 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"
Apr 23 16:35:21.822150 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822037 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-registration-dir\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"
Apr 23 16:35:21.822150 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822053 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-device-dir\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"
Apr 23 16:35:21.822150 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822122 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-kubelet-dir\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"
Apr 23 16:35:21.822246 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822171 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-registration-dir\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"
Apr 23 16:35:21.822246 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822178 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-device-dir\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"
Apr 23 16:35:21.822246 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822202 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a58d25fc-7e16-47db-81f3-d8e27f59a92b-agent-certs\") pod \"konnectivity-agent-86gqj\" (UID: \"a58d25fc-7e16-47db-81f3-d8e27f59a92b\") " pod="kube-system/konnectivity-agent-86gqj"
Apr 23 16:35:21.822246 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822229 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a58d25fc-7e16-47db-81f3-d8e27f59a92b-konnectivity-ca\") pod \"konnectivity-agent-86gqj\" (UID: \"a58d25fc-7e16-47db-81f3-d8e27f59a92b\") " pod="kube-system/konnectivity-agent-86gqj"
Apr 23 16:35:21.822379 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822250 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-etc-selinux\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"
Apr 23 16:35:21.822379 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822362 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-etc-selinux\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"
Apr 23 16:35:21.822455 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822384 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95594\" (UniqueName: \"kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594\") pod \"network-check-target-wn6cd\" (UID: \"0ee2ca50-34ff-4830-8c04-92018768a3a7\") " pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:35:21.822455 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822441 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-socket-dir\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"
Apr 23 16:35:21.822545 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822479 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-sys-fs\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"
Apr 23 16:35:21.822545 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822504 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsw6x\" (UniqueName: \"kubernetes.io/projected/3e1cd67e-5850-4c5d-b72f-849d619e4d11-kube-api-access-zsw6x\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"
Apr 23 16:35:21.822634 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822588 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-sys-fs\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"
Apr 23 16:35:21.822634 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822593 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3e1cd67e-5850-4c5d-b72f-849d619e4d11-socket-dir\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"
Apr 23 16:35:21.822838 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.822822 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a58d25fc-7e16-47db-81f3-d8e27f59a92b-konnectivity-ca\") pod \"konnectivity-agent-86gqj\" (UID: \"a58d25fc-7e16-47db-81f3-d8e27f59a92b\") " pod="kube-system/konnectivity-agent-86gqj"
Apr 23 16:35:21.824631 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.824615 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a58d25fc-7e16-47db-81f3-d8e27f59a92b-agent-certs\") pod \"konnectivity-agent-86gqj\" (UID: \"a58d25fc-7e16-47db-81f3-d8e27f59a92b\") " pod="kube-system/konnectivity-agent-86gqj"
Apr 23 16:35:21.840161 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:21.840132 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:21.840161 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:21.840152 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:21.840161 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:21.840162 2562 projected.go:194] Error preparing data for projected volume kube-api-access-95594 for pod openshift-network-diagnostics/network-check-target-wn6cd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:21.840353 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:21.840213 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594 podName:0ee2ca50-34ff-4830-8c04-92018768a3a7 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:22.340198487 +0000 UTC m=+2.193013533 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-95594" (UniqueName: "kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594") pod "network-check-target-wn6cd" (UID: "0ee2ca50-34ff-4830-8c04-92018768a3a7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:21.841446 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.841426 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsw6x\" (UniqueName: \"kubernetes.io/projected/3e1cd67e-5850-4c5d-b72f-849d619e4d11-kube-api-access-zsw6x\") pod \"aws-ebs-csi-driver-node-wcjsr\" (UID: \"3e1cd67e-5850-4c5d-b72f-849d619e4d11\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"
Apr 23 16:35:21.917645 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.917584 2562 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:21.932191 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.932170 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h6w7v"
Apr 23 16:35:21.938766 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:21.938728 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27c0f8c4_6f27_4267_8d2e_7e39aa9adcef.slice/crio-d6d1780085a945b8122a2a60125e10dfc10da6b30c131f2d249d99cb3d48b8e9 WatchSource:0}: Error finding container d6d1780085a945b8122a2a60125e10dfc10da6b30c131f2d249d99cb3d48b8e9: Status 404 returned error can't find the container with id d6d1780085a945b8122a2a60125e10dfc10da6b30c131f2d249d99cb3d48b8e9
Apr 23 16:35:21.950732 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.950715 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-kjd4g"
Apr 23 16:35:21.956454 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.956429 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k"
Apr 23 16:35:21.956685 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:21.956660 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod380d4b90_b7df_4855_ae70_0dc24d42f0d3.slice/crio-5abfef4a9036b6ffa1cb93f66f0749e98ea3ddb09ab6fb1a4410dc636c6de20b WatchSource:0}: Error finding container 5abfef4a9036b6ffa1cb93f66f0749e98ea3ddb09ab6fb1a4410dc636c6de20b: Status 404 returned error can't find the container with id 5abfef4a9036b6ffa1cb93f66f0749e98ea3ddb09ab6fb1a4410dc636c6de20b
Apr 23 16:35:21.962961 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:21.962939 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd41c55a_49c5_41cd_8fd5_f7964f5444e9.slice/crio-2797cbd055defa8cb71ad0dc0b0d21144a216f1875ebf20d6595270696014c07 WatchSource:0}: Error finding container 2797cbd055defa8cb71ad0dc0b0d21144a216f1875ebf20d6595270696014c07: Status 404 returned error can't find the container with id 2797cbd055defa8cb71ad0dc0b0d21144a216f1875ebf20d6595270696014c07
Apr 23 16:35:21.968417 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.968397 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-vllcp"
Apr 23 16:35:21.973216 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.973195 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-h68c5"
Apr 23 16:35:21.974694 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:21.974670 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb15c0f5c_7376_4c94_b44b_d6eb7a192048.slice/crio-1b3a9422e9ef77ed362fa9d8455b89d05c167f4c16dd96507610f8bea24aa654 WatchSource:0}: Error finding container 1b3a9422e9ef77ed362fa9d8455b89d05c167f4c16dd96507610f8bea24aa654: Status 404 returned error can't find the container with id 1b3a9422e9ef77ed362fa9d8455b89d05c167f4c16dd96507610f8bea24aa654
Apr 23 16:35:21.978697 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.978679 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-q6xdb"
Apr 23 16:35:21.980818 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:21.980792 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5a7d4ac_2ec3_48b6_8f5f_0d911a26eb86.slice/crio-0e3d6218828e74b5a13955b6b42aadb9000614cc1c2bf54b6128130a1f3f0454 WatchSource:0}: Error finding container 0e3d6218828e74b5a13955b6b42aadb9000614cc1c2bf54b6128130a1f3f0454: Status 404 returned error can't find the container with id 0e3d6218828e74b5a13955b6b42aadb9000614cc1c2bf54b6128130a1f3f0454
Apr 23 16:35:21.985366 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:21.985346 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2226bf67_86e3_4375_a378_075aedce2ea6.slice/crio-cfb51feeac261e62c1c9f27ba78bbfac7a68348302d1dd0115242a30641c48ad WatchSource:0}: Error finding container cfb51feeac261e62c1c9f27ba78bbfac7a68348302d1dd0115242a30641c48ad: Status 404 returned error can't find the container with id cfb51feeac261e62c1c9f27ba78bbfac7a68348302d1dd0115242a30641c48ad
Apr 23 16:35:21.995497 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:21.995475 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-86gqj"
Apr 23 16:35:22.000991 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.000964 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr"
Apr 23 16:35:22.001235 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:22.001206 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda58d25fc_7e16_47db_81f3_d8e27f59a92b.slice/crio-2e0b5112c4a25b09a2d96d1e1ae48a5b1b202770951fe4abd9304977b743c583 WatchSource:0}: Error finding container 2e0b5112c4a25b09a2d96d1e1ae48a5b1b202770951fe4abd9304977b743c583: Status 404 returned error can't find the container with id 2e0b5112c4a25b09a2d96d1e1ae48a5b1b202770951fe4abd9304977b743c583
Apr 23 16:35:22.006892 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:35:22.006870 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e1cd67e_5850_4c5d_b72f_849d619e4d11.slice/crio-7055c7a0b276c94490963f5c9342da7b88a6595b2adbed54d900e5290cf6ee59 WatchSource:0}: Error finding container 7055c7a0b276c94490963f5c9342da7b88a6595b2adbed54d900e5290cf6ee59: Status 404 returned error can't find the container with id 7055c7a0b276c94490963f5c9342da7b88a6595b2adbed54d900e5290cf6ee59
Apr 23 16:35:22.225289 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.225198 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs\") pod \"network-metrics-daemon-6wbcq\" (UID: \"adbb31b5-ee6b-431b-ac95-7775688ba039\") " pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:35:22.225451 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:22.225367 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:22.225451 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:22.225441 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs podName:adbb31b5-ee6b-431b-ac95-7775688ba039 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:23.225422488 +0000 UTC m=+3.078237536 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs") pod "network-metrics-daemon-6wbcq" (UID: "adbb31b5-ee6b-431b-ac95-7775688ba039") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:22.427027 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.426992 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95594\" (UniqueName: \"kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594\") pod \"network-check-target-wn6cd\" (UID: \"0ee2ca50-34ff-4830-8c04-92018768a3a7\") " pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:35:22.427281 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:22.427164 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:22.427281 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:22.427183 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:22.427281 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:22.427196 2562 projected.go:194] Error preparing data for projected volume kube-api-access-95594 for pod openshift-network-diagnostics/network-check-target-wn6cd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:22.427281 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:22.427251 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594 podName:0ee2ca50-34ff-4830-8c04-92018768a3a7 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:23.427233479 +0000 UTC m=+3.280048539 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-95594" (UniqueName: "kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594") pod "network-check-target-wn6cd" (UID: "0ee2ca50-34ff-4830-8c04-92018768a3a7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:22.525656 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.525570 2562 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:22.627817 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.627788 2562 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 16:35:22.665538 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.665495 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:21 +0000 UTC" deadline="2027-10-12 13:23:45.709078959 +0000 UTC"
Apr 23 16:35:22.665538 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.665537 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12884h48m23.043547002s"
Apr 23 16:35:22.748394 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.748363 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:35:22.748570 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:22.748481 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7"
Apr 23 16:35:22.769450 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.769412 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q6xdb" event={"ID":"2226bf67-86e3-4375-a378-075aedce2ea6","Type":"ContainerStarted","Data":"cfb51feeac261e62c1c9f27ba78bbfac7a68348302d1dd0115242a30641c48ad"}
Apr 23 16:35:22.780845 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.780761 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h68c5" event={"ID":"e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86","Type":"ContainerStarted","Data":"0e3d6218828e74b5a13955b6b42aadb9000614cc1c2bf54b6128130a1f3f0454"}
Apr 23 16:35:22.787768 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.787719 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vllcp" event={"ID":"b15c0f5c-7376-4c94-b44b-d6eb7a192048","Type":"ContainerStarted","Data":"1b3a9422e9ef77ed362fa9d8455b89d05c167f4c16dd96507610f8bea24aa654"}
Apr 23 16:35:22.795254 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.795223 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" event={"ID":"fd41c55a-49c5-41cd-8fd5-f7964f5444e9","Type":"ContainerStarted","Data":"2797cbd055defa8cb71ad0dc0b0d21144a216f1875ebf20d6595270696014c07"}
Apr 23 16:35:22.803421 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.803338 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr" event={"ID":"3e1cd67e-5850-4c5d-b72f-849d619e4d11","Type":"ContainerStarted","Data":"7055c7a0b276c94490963f5c9342da7b88a6595b2adbed54d900e5290cf6ee59"}
Apr 23 16:35:22.811350 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.811323 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-86gqj" event={"ID":"a58d25fc-7e16-47db-81f3-d8e27f59a92b","Type":"ContainerStarted","Data":"2e0b5112c4a25b09a2d96d1e1ae48a5b1b202770951fe4abd9304977b743c583"}
Apr 23 16:35:22.824766 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.821471 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kjd4g" event={"ID":"380d4b90-b7df-4855-ae70-0dc24d42f0d3","Type":"ContainerStarted","Data":"5abfef4a9036b6ffa1cb93f66f0749e98ea3ddb09ab6fb1a4410dc636c6de20b"}
Apr 23 16:35:22.827992 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:22.827934 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h6w7v" event={"ID":"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef","Type":"ContainerStarted","Data":"d6d1780085a945b8122a2a60125e10dfc10da6b30c131f2d249d99cb3d48b8e9"}
Apr 23 16:35:23.233946 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:23.233908 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs\") pod \"network-metrics-daemon-6wbcq\" (UID: \"adbb31b5-ee6b-431b-ac95-7775688ba039\") " pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:35:23.234138 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:23.234073 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:23.234138 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:23.234136 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs podName:adbb31b5-ee6b-431b-ac95-7775688ba039 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:25.234117271 +0000 UTC m=+5.086932314 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs") pod "network-metrics-daemon-6wbcq" (UID: "adbb31b5-ee6b-431b-ac95-7775688ba039") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:23.436992 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:23.436715 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95594\" (UniqueName: \"kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594\") pod \"network-check-target-wn6cd\" (UID: \"0ee2ca50-34ff-4830-8c04-92018768a3a7\") " pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:35:23.436992 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:23.436984 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:23.436992 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:23.437004 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:23.437264 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:23.437019 2562 projected.go:194] Error preparing data for projected volume kube-api-access-95594 for pod openshift-network-diagnostics/network-check-target-wn6cd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:23.437264 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:23.437081 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594 podName:0ee2ca50-34ff-4830-8c04-92018768a3a7 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:25.437061809 +0000 UTC m=+5.289876854 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-95594" (UniqueName: "kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594") pod "network-check-target-wn6cd" (UID: "0ee2ca50-34ff-4830-8c04-92018768a3a7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:23.665755 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:23.665695 2562 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:30:21 +0000 UTC" deadline="2027-12-03 13:25:25.538059118 +0000 UTC"
Apr 23 16:35:23.665755 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:23.665753 2562 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14132h50m1.872324932s"
Apr 23 16:35:23.738436 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:23.738402 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:35:23.738617 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:23.738553 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039"
Apr 23 16:35:24.740717 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:24.740207 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:35:24.740717 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:24.740335 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7"
Apr 23 16:35:25.251002 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:25.250964 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs\") pod \"network-metrics-daemon-6wbcq\" (UID: \"adbb31b5-ee6b-431b-ac95-7775688ba039\") " pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:35:25.251171 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:25.251135 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:25.251244 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:25.251218 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs podName:adbb31b5-ee6b-431b-ac95-7775688ba039 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:29.251198327 +0000 UTC m=+9.104013374 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs") pod "network-metrics-daemon-6wbcq" (UID: "adbb31b5-ee6b-431b-ac95-7775688ba039") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:25.453204 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:25.453108 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95594\" (UniqueName: \"kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594\") pod \"network-check-target-wn6cd\" (UID: \"0ee2ca50-34ff-4830-8c04-92018768a3a7\") " pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:35:25.453369 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:25.453284 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:25.453369 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:25.453310 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:25.453369 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:25.453323 2562 projected.go:194] Error preparing data for projected volume kube-api-access-95594 for pod openshift-network-diagnostics/network-check-target-wn6cd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:25.453542 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:25.453384 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594 podName:0ee2ca50-34ff-4830-8c04-92018768a3a7 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:29.453363169 +0000 UTC m=+9.306178231 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-95594" (UniqueName: "kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594") pod "network-check-target-wn6cd" (UID: "0ee2ca50-34ff-4830-8c04-92018768a3a7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:25.738052 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:25.737789 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:35:25.738221 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:25.738147 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039"
Apr 23 16:35:26.573869 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.573125 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-bfsmf"]
Apr 23 16:35:26.575174 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.575151 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bfsmf"
Apr 23 16:35:26.578528 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.578504 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 23 16:35:26.578996 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.578833 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 23 16:35:26.579874 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.579608 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-285c2\""
Apr 23 16:35:26.664141 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.664104 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/43aa3f2a-b871-42c1-a3ca-550b762203bf-tmp-dir\") pod \"node-resolver-bfsmf\" (UID: \"43aa3f2a-b871-42c1-a3ca-550b762203bf\") " pod="openshift-dns/node-resolver-bfsmf"
Apr 23 16:35:26.664318 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.664218 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l9v6\" (UniqueName: \"kubernetes.io/projected/43aa3f2a-b871-42c1-a3ca-550b762203bf-kube-api-access-2l9v6\") pod \"node-resolver-bfsmf\" (UID: \"43aa3f2a-b871-42c1-a3ca-550b762203bf\") " pod="openshift-dns/node-resolver-bfsmf"
Apr 23 16:35:26.664318 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.664269 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/43aa3f2a-b871-42c1-a3ca-550b762203bf-hosts-file\") pod \"node-resolver-bfsmf\" (UID: \"43aa3f2a-b871-42c1-a3ca-550b762203bf\") " pod="openshift-dns/node-resolver-bfsmf"
Apr 23 16:35:26.738231 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.737755
2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:35:26.738231 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:26.737880 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7" Apr 23 16:35:26.765581 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.765544 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2l9v6\" (UniqueName: \"kubernetes.io/projected/43aa3f2a-b871-42c1-a3ca-550b762203bf-kube-api-access-2l9v6\") pod \"node-resolver-bfsmf\" (UID: \"43aa3f2a-b871-42c1-a3ca-550b762203bf\") " pod="openshift-dns/node-resolver-bfsmf" Apr 23 16:35:26.765791 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.765731 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/43aa3f2a-b871-42c1-a3ca-550b762203bf-hosts-file\") pod \"node-resolver-bfsmf\" (UID: \"43aa3f2a-b871-42c1-a3ca-550b762203bf\") " pod="openshift-dns/node-resolver-bfsmf" Apr 23 16:35:26.765791 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.765786 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/43aa3f2a-b871-42c1-a3ca-550b762203bf-tmp-dir\") pod \"node-resolver-bfsmf\" (UID: \"43aa3f2a-b871-42c1-a3ca-550b762203bf\") " pod="openshift-dns/node-resolver-bfsmf" Apr 23 16:35:26.765900 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.765822 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/43aa3f2a-b871-42c1-a3ca-550b762203bf-hosts-file\") pod \"node-resolver-bfsmf\" (UID: \"43aa3f2a-b871-42c1-a3ca-550b762203bf\") " pod="openshift-dns/node-resolver-bfsmf" Apr 23 16:35:26.766176 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.766156 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/43aa3f2a-b871-42c1-a3ca-550b762203bf-tmp-dir\") pod \"node-resolver-bfsmf\" (UID: \"43aa3f2a-b871-42c1-a3ca-550b762203bf\") " pod="openshift-dns/node-resolver-bfsmf" Apr 23 16:35:26.778667 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.778607 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l9v6\" (UniqueName: \"kubernetes.io/projected/43aa3f2a-b871-42c1-a3ca-550b762203bf-kube-api-access-2l9v6\") pod \"node-resolver-bfsmf\" (UID: \"43aa3f2a-b871-42c1-a3ca-550b762203bf\") " pod="openshift-dns/node-resolver-bfsmf" Apr 23 16:35:26.888127 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:26.888090 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bfsmf" Apr 23 16:35:27.338975 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:27.337987 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qzdzg"] Apr 23 16:35:27.341944 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:27.341700 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:27.341944 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:27.341795 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d" Apr 23 16:35:27.370058 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:27.370023 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/451de1c0-4375-4c37-8a62-0641aa75255d-kubelet-config\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:27.370225 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:27.370078 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:27.370225 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:27.370121 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/451de1c0-4375-4c37-8a62-0641aa75255d-dbus\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:27.470771 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:27.470721 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/451de1c0-4375-4c37-8a62-0641aa75255d-kubelet-config\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:27.470956 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:27.470792 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:27.470956 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:27.470843 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/451de1c0-4375-4c37-8a62-0641aa75255d-dbus\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:27.470956 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:27.470883 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/451de1c0-4375-4c37-8a62-0641aa75255d-kubelet-config\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:27.471110 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:27.471007 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/451de1c0-4375-4c37-8a62-0641aa75255d-dbus\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:27.471110 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:27.471008 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:27.471110 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:27.471088 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret podName:451de1c0-4375-4c37-8a62-0641aa75255d nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:27.971069335 +0000 UTC m=+7.823884380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret") pod "global-pull-secret-syncer-qzdzg" (UID: "451de1c0-4375-4c37-8a62-0641aa75255d") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:27.737646 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:27.737617 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:27.738111 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:27.737730 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039" Apr 23 16:35:27.974516 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:27.974482 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:27.974699 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:27.974625 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:27.974699 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:27.974681 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret podName:451de1c0-4375-4c37-8a62-0641aa75255d nodeName:}" failed. No retries permitted until 2026-04-23 16:35:28.974667063 +0000 UTC m=+8.827482109 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret") pod "global-pull-secret-syncer-qzdzg" (UID: "451de1c0-4375-4c37-8a62-0641aa75255d") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:28.738445 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:28.737937 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:35:28.738445 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:28.737972 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:28.738445 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:28.738067 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7" Apr 23 16:35:28.738445 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:28.738201 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d" Apr 23 16:35:28.983046 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:28.982994 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:28.983227 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:28.983190 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:28.983302 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:28.983254 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret podName:451de1c0-4375-4c37-8a62-0641aa75255d nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:30.983233611 +0000 UTC m=+10.836048657 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret") pod "global-pull-secret-syncer-qzdzg" (UID: "451de1c0-4375-4c37-8a62-0641aa75255d") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:29.285396 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:29.285360 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs\") pod \"network-metrics-daemon-6wbcq\" (UID: \"adbb31b5-ee6b-431b-ac95-7775688ba039\") " pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:29.285583 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:29.285550 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:29.285653 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:29.285609 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs podName:adbb31b5-ee6b-431b-ac95-7775688ba039 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:37.285590762 +0000 UTC m=+17.138405807 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs") pod "network-metrics-daemon-6wbcq" (UID: "adbb31b5-ee6b-431b-ac95-7775688ba039") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:29.486931 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:29.486767 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95594\" (UniqueName: \"kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594\") pod \"network-check-target-wn6cd\" (UID: \"0ee2ca50-34ff-4830-8c04-92018768a3a7\") " pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:35:29.487125 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:29.486957 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:29.487125 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:29.486976 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:29.487125 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:29.486988 2562 projected.go:194] Error preparing data for projected volume kube-api-access-95594 for pod openshift-network-diagnostics/network-check-target-wn6cd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:29.487125 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:29.487051 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594 podName:0ee2ca50-34ff-4830-8c04-92018768a3a7 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:37.487031731 +0000 UTC m=+17.339846772 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-95594" (UniqueName: "kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594") pod "network-check-target-wn6cd" (UID: "0ee2ca50-34ff-4830-8c04-92018768a3a7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:29.737787 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:29.737735 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:29.737965 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:29.737898 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039" Apr 23 16:35:30.738653 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:30.738614 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:30.739113 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:30.738735 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d" Apr 23 16:35:30.739113 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:30.738823 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:35:30.739113 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:30.738925 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7" Apr 23 16:35:31.000349 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:31.000255 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:31.000500 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:31.000376 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:31.000500 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:31.000450 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret podName:451de1c0-4375-4c37-8a62-0641aa75255d nodeName:}" failed. No retries permitted until 2026-04-23 16:35:35.000434364 +0000 UTC m=+14.853249410 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret") pod "global-pull-secret-syncer-qzdzg" (UID: "451de1c0-4375-4c37-8a62-0641aa75255d") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:31.738149 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:31.738110 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:31.738311 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:31.738244 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039" Apr 23 16:35:32.738360 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:32.738320 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:35:32.738814 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:32.738362 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:32.738814 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:32.738439 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7" Apr 23 16:35:32.738814 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:32.738576 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d" Apr 23 16:35:33.737911 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:33.737879 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:33.738100 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:33.738010 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039" Apr 23 16:35:34.740817 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:34.740787 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:35:34.741245 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:34.740786 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:34.741245 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:34.740900 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7" Apr 23 16:35:34.741245 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:34.740984 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d" Apr 23 16:35:35.030236 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:35.030138 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:35.030389 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:35.030309 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:35.030432 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:35.030395 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret podName:451de1c0-4375-4c37-8a62-0641aa75255d nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:43.030375594 +0000 UTC m=+22.883190639 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret") pod "global-pull-secret-syncer-qzdzg" (UID: "451de1c0-4375-4c37-8a62-0641aa75255d") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:35.738165 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:35.738124 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:35.738344 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:35.738255 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039" Apr 23 16:35:36.737900 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:36.737861 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:35:36.738357 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:36.737910 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:36.738357 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:36.737995 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7" Apr 23 16:35:36.738357 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:36.738156 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d" Apr 23 16:35:37.348002 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:37.347965 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs\") pod \"network-metrics-daemon-6wbcq\" (UID: \"adbb31b5-ee6b-431b-ac95-7775688ba039\") " pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:37.348226 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:37.348145 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:37.348226 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:37.348213 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs podName:adbb31b5-ee6b-431b-ac95-7775688ba039 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:53.348194046 +0000 UTC m=+33.201009089 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs") pod "network-metrics-daemon-6wbcq" (UID: "adbb31b5-ee6b-431b-ac95-7775688ba039") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:37.549776 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:37.549723 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95594\" (UniqueName: \"kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594\") pod \"network-check-target-wn6cd\" (UID: \"0ee2ca50-34ff-4830-8c04-92018768a3a7\") " pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:35:37.549963 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:37.549886 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:37.549963 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:37.549911 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:37.549963 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:37.549924 2562 projected.go:194] Error preparing data for projected volume kube-api-access-95594 for pod openshift-network-diagnostics/network-check-target-wn6cd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:37.550110 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:37.549990 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594 podName:0ee2ca50-34ff-4830-8c04-92018768a3a7 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:53.549971998 +0000 UTC m=+33.402787059 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-95594" (UniqueName: "kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594") pod "network-check-target-wn6cd" (UID: "0ee2ca50-34ff-4830-8c04-92018768a3a7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:37.737963 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:37.737922 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:37.738352 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:37.738071 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039" Apr 23 16:35:38.738392 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:38.738357 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:38.738835 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:38.738357 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:35:38.738835 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:38.738484 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d" Apr 23 16:35:38.738835 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:38.738587 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7" Apr 23 16:35:39.738187 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:39.737843 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:39.738187 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:39.738174 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039" Apr 23 16:35:39.870590 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:39.870558 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-vllcp" event={"ID":"b15c0f5c-7376-4c94-b44b-d6eb7a192048","Type":"ContainerStarted","Data":"f125172fc39a06b496e52ac75b8955d8cd96ee784bd3b9a69badd8fcf3fdbab5"} Apr 23 16:35:39.874206 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:39.874177 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" event={"ID":"fd41c55a-49c5-41cd-8fd5-f7964f5444e9","Type":"ContainerStarted","Data":"b9cfa1e22395a61c164841f50fd0927de6f5c0539a246296603f5549fa6c5453"} Apr 23 16:35:39.874296 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:39.874215 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" event={"ID":"fd41c55a-49c5-41cd-8fd5-f7964f5444e9","Type":"ContainerStarted","Data":"b9963e08766a9b0e4cdfd54089bcdba4f56db14ed0d4e8fee7eee026c011dc19"} Apr 23 16:35:39.874296 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:39.874230 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" event={"ID":"fd41c55a-49c5-41cd-8fd5-f7964f5444e9","Type":"ContainerStarted","Data":"aedbbcebbc42730f57e62e1d115c9709f0650cc724e01e56f32e353724d0f4aa"} Apr 23 16:35:39.874296 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:39.874243 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" event={"ID":"fd41c55a-49c5-41cd-8fd5-f7964f5444e9","Type":"ContainerStarted","Data":"7db43f8f9414375ffe440281007c25ba605c03a4d59530899f1b951a9a4e1b48"} Apr 23 16:35:39.878186 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:39.878154 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h6w7v" 
event={"ID":"27c0f8c4-6f27-4267-8d2e-7e39aa9adcef","Type":"ContainerStarted","Data":"eea8a218fb9a5ee9be6612ea93e7a66f33d21e266243d7911e35ceab5079258d"} Apr 23 16:35:39.879419 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:39.879395 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bfsmf" event={"ID":"43aa3f2a-b871-42c1-a3ca-550b762203bf","Type":"ContainerStarted","Data":"58458241a95f6c879dfc9215055065f0aa7bb744f27c752663fa72e6f113e5c5"} Apr 23 16:35:39.881024 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:39.880985 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-57.ec2.internal" event={"ID":"cb72edff024abc1bfc776fb556b861df","Type":"ContainerStarted","Data":"4a9d0793cfb5436074f3c3382b3e117d6e4e750848ecb0cef099b785e1e17eb2"} Apr 23 16:35:39.968473 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:39.967548 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-vllcp" podStartSLOduration=2.640089952 podStartE2EDuration="19.967529011s" podCreationTimestamp="2026-04-23 16:35:20 +0000 UTC" firstStartedPulling="2026-04-23 16:35:21.977342618 +0000 UTC m=+1.830157667" lastFinishedPulling="2026-04-23 16:35:39.304781681 +0000 UTC m=+19.157596726" observedRunningTime="2026-04-23 16:35:39.919641228 +0000 UTC m=+19.772456333" watchObservedRunningTime="2026-04-23 16:35:39.967529011 +0000 UTC m=+19.820344075" Apr 23 16:35:39.969020 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:39.968985 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-57.ec2.internal" podStartSLOduration=18.968969054 podStartE2EDuration="18.968969054s" podCreationTimestamp="2026-04-23 16:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:39.96773453 +0000 UTC 
m=+19.820549593" watchObservedRunningTime="2026-04-23 16:35:39.968969054 +0000 UTC m=+19.821784154" Apr 23 16:35:40.738270 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.738236 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:35:40.738411 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:40.738324 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7" Apr 23 16:35:40.738411 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.738401 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:40.738503 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:40.738485 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d" Apr 23 16:35:40.883860 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.883824 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bfsmf" event={"ID":"43aa3f2a-b871-42c1-a3ca-550b762203bf","Type":"ContainerStarted","Data":"f40fb0af1f751fdd8d9fb47c1bcaa16afa1958ba1db53e47e0c7317b7f898fc8"} Apr 23 16:35:40.885149 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.885123 2562 generic.go:358] "Generic (PLEG): container finished" podID="2226bf67-86e3-4375-a378-075aedce2ea6" containerID="8a44ff9bcaf3ad2f50d2e7d1ceaad2bf0b62e520792742a58fff09c9bb109496" exitCode=0 Apr 23 16:35:40.885237 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.885197 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q6xdb" event={"ID":"2226bf67-86e3-4375-a378-075aedce2ea6","Type":"ContainerDied","Data":"8a44ff9bcaf3ad2f50d2e7d1ceaad2bf0b62e520792742a58fff09c9bb109496"} Apr 23 16:35:40.886541 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.886517 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h68c5" event={"ID":"e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86","Type":"ContainerStarted","Data":"13d1145debd5539f193a9dd4fc2b0ea174edce1980b7cd5c508271f5a14a28f3"} Apr 23 16:35:40.888865 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.888844 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" event={"ID":"fd41c55a-49c5-41cd-8fd5-f7964f5444e9","Type":"ContainerStarted","Data":"72681110a562e5cc848e90b82fd6ac7f56ad311745334b507153e841644978b9"} Apr 23 16:35:40.888964 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.888872 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" 
event={"ID":"fd41c55a-49c5-41cd-8fd5-f7964f5444e9","Type":"ContainerStarted","Data":"b00fe22345c3de86aca4744199c80395e90ed8cf2bab22f3524ba389184feba4"} Apr 23 16:35:40.890046 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.890021 2562 generic.go:358] "Generic (PLEG): container finished" podID="6bfe44dce378e2f142aa223226db1983" containerID="cde5434bd2407011e95ebd5261e0402dfc1091217d92cb4807af0f5ce06c1862" exitCode=0 Apr 23 16:35:40.890136 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.890060 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal" event={"ID":"6bfe44dce378e2f142aa223226db1983","Type":"ContainerDied","Data":"cde5434bd2407011e95ebd5261e0402dfc1091217d92cb4807af0f5ce06c1862"} Apr 23 16:35:40.891674 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.891656 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr" event={"ID":"3e1cd67e-5850-4c5d-b72f-849d619e4d11","Type":"ContainerStarted","Data":"5617fe7b89c9a35924256d721dacb89e034d0e90a0867cf4fc18b71f0e086d2a"} Apr 23 16:35:40.892908 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.892881 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-86gqj" event={"ID":"a58d25fc-7e16-47db-81f3-d8e27f59a92b","Type":"ContainerStarted","Data":"6fd34b3773fd17efc8b6d49c49f1500bd4550b60bfa332d8a6fdb6f5b601a517"} Apr 23 16:35:40.894400 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.894378 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-kjd4g" event={"ID":"380d4b90-b7df-4855-ae70-0dc24d42f0d3","Type":"ContainerStarted","Data":"fbd45c2caf22f756e0bed269da7747dcb465c327c01d8ae339de26671b0e9c9d"} Apr 23 16:35:40.907344 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.907302 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-h6w7v" podStartSLOduration=3.409287721 podStartE2EDuration="20.907289339s" podCreationTimestamp="2026-04-23 16:35:20 +0000 UTC" firstStartedPulling="2026-04-23 16:35:21.940297406 +0000 UTC m=+1.793112447" lastFinishedPulling="2026-04-23 16:35:39.438299018 +0000 UTC m=+19.291114065" observedRunningTime="2026-04-23 16:35:40.048942166 +0000 UTC m=+19.901757228" watchObservedRunningTime="2026-04-23 16:35:40.907289339 +0000 UTC m=+20.760104403" Apr 23 16:35:40.908000 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.907980 2562 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 16:35:40.930364 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.930015 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-h68c5" podStartSLOduration=3.615452341 podStartE2EDuration="20.92999747s" podCreationTimestamp="2026-04-23 16:35:20 +0000 UTC" firstStartedPulling="2026-04-23 16:35:21.982697016 +0000 UTC m=+1.835512057" lastFinishedPulling="2026-04-23 16:35:39.297242139 +0000 UTC m=+19.150057186" observedRunningTime="2026-04-23 16:35:40.929555483 +0000 UTC m=+20.782370547" watchObservedRunningTime="2026-04-23 16:35:40.92999747 +0000 UTC m=+20.782812535" Apr 23 16:35:40.930853 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:40.930812 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bfsmf" podStartSLOduration=14.930798197 podStartE2EDuration="14.930798197s" podCreationTimestamp="2026-04-23 16:35:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:40.908933091 +0000 UTC m=+20.761748156" watchObservedRunningTime="2026-04-23 16:35:40.930798197 +0000 UTC m=+20.783613263" Apr 23 16:35:41.018172 ip-10-0-135-57 kubenswrapper[2562]: I0423 
16:35:41.018123 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-86gqj" podStartSLOduration=3.723434728 podStartE2EDuration="21.018110771s" podCreationTimestamp="2026-04-23 16:35:20 +0000 UTC" firstStartedPulling="2026-04-23 16:35:22.002579294 +0000 UTC m=+1.855394336" lastFinishedPulling="2026-04-23 16:35:39.297255324 +0000 UTC m=+19.150070379" observedRunningTime="2026-04-23 16:35:41.017925448 +0000 UTC m=+20.870740512" watchObservedRunningTime="2026-04-23 16:35:41.018110771 +0000 UTC m=+20.870925834" Apr 23 16:35:41.018330 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:41.018295 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-kjd4g" podStartSLOduration=3.679544321 podStartE2EDuration="21.018292217s" podCreationTimestamp="2026-04-23 16:35:20 +0000 UTC" firstStartedPulling="2026-04-23 16:35:21.958510134 +0000 UTC m=+1.811325191" lastFinishedPulling="2026-04-23 16:35:39.297258037 +0000 UTC m=+19.150073087" observedRunningTime="2026-04-23 16:35:40.993291518 +0000 UTC m=+20.846106583" watchObservedRunningTime="2026-04-23 16:35:41.018292217 +0000 UTC m=+20.871107328" Apr 23 16:35:41.686119 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:41.686034 2562 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T16:35:40.907996936Z","UUID":"c1f202d9-f009-4163-93d0-7ff9729009e9","Handler":null,"Name":"","Endpoint":""} Apr 23 16:35:41.687645 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:41.687615 2562 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 16:35:41.687645 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:41.687645 2562 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: 
ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 16:35:41.737534 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:41.737503 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:41.737713 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:41.737640 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039" Apr 23 16:35:41.898417 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:41.898378 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal" event={"ID":"6bfe44dce378e2f142aa223226db1983","Type":"ContainerStarted","Data":"18afea7efd07e0ae8846bf1e0d5d1c5e4d598980cb9bc87c07c53a7917d6f6a7"} Apr 23 16:35:41.900499 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:41.900473 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr" event={"ID":"3e1cd67e-5850-4c5d-b72f-849d619e4d11","Type":"ContainerStarted","Data":"52e65607463a752a7339839418dd7d6ec4cc78383b5891569afb713cfef1440d"} Apr 23 16:35:41.900622 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:41.900504 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr" event={"ID":"3e1cd67e-5850-4c5d-b72f-849d619e4d11","Type":"ContainerStarted","Data":"1f7325eb447f98e1ada4430b51836a93e59e93caf0cf4da06659d372ab89d2fc"} Apr 23 16:35:41.930208 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:41.930162 2562 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-57.ec2.internal" podStartSLOduration=20.930147494 podStartE2EDuration="20.930147494s" podCreationTimestamp="2026-04-23 16:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:41.929915388 +0000 UTC m=+21.782730451" watchObservedRunningTime="2026-04-23 16:35:41.930147494 +0000 UTC m=+21.782962558" Apr 23 16:35:41.956505 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:41.956391 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-wcjsr" podStartSLOduration=2.265632276 podStartE2EDuration="21.956372627s" podCreationTimestamp="2026-04-23 16:35:20 +0000 UTC" firstStartedPulling="2026-04-23 16:35:22.008412853 +0000 UTC m=+1.861227895" lastFinishedPulling="2026-04-23 16:35:41.699153191 +0000 UTC m=+21.551968246" observedRunningTime="2026-04-23 16:35:41.955506657 +0000 UTC m=+21.808321722" watchObservedRunningTime="2026-04-23 16:35:41.956372627 +0000 UTC m=+21.809187691" Apr 23 16:35:42.738027 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:42.737996 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:35:42.738292 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:42.738006 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:42.738292 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:42.738123 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7" Apr 23 16:35:42.738292 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:42.738204 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d" Apr 23 16:35:42.905712 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:42.905671 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" event={"ID":"fd41c55a-49c5-41cd-8fd5-f7964f5444e9","Type":"ContainerStarted","Data":"26e70560244d6f2cf6ee1f4b9d0e02134cd51d7fc087ac02c0b72bdb64ea19a8"} Apr 23 16:35:43.095558 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:43.095475 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:43.095715 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:43.095628 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:43.095715 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:43.095700 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret podName:451de1c0-4375-4c37-8a62-0641aa75255d nodeName:}" failed. No retries permitted until 2026-04-23 16:35:59.095681237 +0000 UTC m=+38.948496293 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret") pod "global-pull-secret-syncer-qzdzg" (UID: "451de1c0-4375-4c37-8a62-0641aa75255d") : object "kube-system"/"original-pull-secret" not registered Apr 23 16:35:43.738403 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:43.738371 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:43.738584 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:43.738517 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039" Apr 23 16:35:44.679803 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:44.679609 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-86gqj" Apr 23 16:35:44.680261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:44.680240 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-86gqj" Apr 23 16:35:44.738234 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:44.738204 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:35:44.738413 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:44.738321 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7" Apr 23 16:35:44.738413 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:44.738387 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:35:44.738530 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:44.738493 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d" Apr 23 16:35:45.294891 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:45.294799 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-86gqj" Apr 23 16:35:45.295451 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:45.295421 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-86gqj" Apr 23 16:35:45.737845 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:45.737662 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:35:45.738443 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:45.737922 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039" Apr 23 16:35:45.918331 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:45.918295 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" event={"ID":"fd41c55a-49c5-41cd-8fd5-f7964f5444e9","Type":"ContainerStarted","Data":"573286a20883256f0acc22e865e79914fdf565e4c7716af7562855cb401be18d"} Apr 23 16:35:45.918661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:45.918635 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:45.918818 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:45.918683 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:45.919917 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:45.919897 2562 generic.go:358] "Generic (PLEG): container finished" podID="2226bf67-86e3-4375-a378-075aedce2ea6" containerID="8f6c137e9d7297b6c5d43cd5f55e2bcad83c0a14926d96be05a5df70f6e6bada" exitCode=0 Apr 23 16:35:45.920016 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:45.919986 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q6xdb" event={"ID":"2226bf67-86e3-4375-a378-075aedce2ea6","Type":"ContainerDied","Data":"8f6c137e9d7297b6c5d43cd5f55e2bcad83c0a14926d96be05a5df70f6e6bada"} Apr 23 16:35:45.933569 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:45.933545 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" Apr 23 16:35:45.952694 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:45.952648 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k" podStartSLOduration=8.567802007000001 podStartE2EDuration="25.952636162s" podCreationTimestamp="2026-04-23 16:35:20 
+0000 UTC" firstStartedPulling="2026-04-23 16:35:21.964841226 +0000 UTC m=+1.817656268" lastFinishedPulling="2026-04-23 16:35:39.349675381 +0000 UTC m=+19.202490423" observedRunningTime="2026-04-23 16:35:45.951019712 +0000 UTC m=+25.803834776" watchObservedRunningTime="2026-04-23 16:35:45.952636162 +0000 UTC m=+25.805451225"
Apr 23 16:35:46.738141 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:46.738106 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:35:46.738586 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:46.738112 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg"
Apr 23 16:35:46.738586 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:46.738208 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7"
Apr 23 16:35:46.738586 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:46.738283 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d"
Apr 23 16:35:46.923385 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:46.923352 2562 generic.go:358] "Generic (PLEG): container finished" podID="2226bf67-86e3-4375-a378-075aedce2ea6" containerID="e0b7d270cb1e8526cfb08e547598e3cc46e44996437cbc0a4edc3e18dcacfaa0" exitCode=0
Apr 23 16:35:46.923575 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:46.923452 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q6xdb" event={"ID":"2226bf67-86e3-4375-a378-075aedce2ea6","Type":"ContainerDied","Data":"e0b7d270cb1e8526cfb08e547598e3cc46e44996437cbc0a4edc3e18dcacfaa0"}
Apr 23 16:35:46.924260 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:46.924215 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k"
Apr 23 16:35:46.938527 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:46.938503 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k"
Apr 23 16:35:47.737494 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:47.737406 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:35:47.737624 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:47.737513 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039"
Apr 23 16:35:47.927261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:47.927225 2562 generic.go:358] "Generic (PLEG): container finished" podID="2226bf67-86e3-4375-a378-075aedce2ea6" containerID="f3b6b3105111a6e358c009441f9e38a894d9cc422703fe530e01c68e3285bc91" exitCode=0
Apr 23 16:35:47.927617 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:47.927263 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q6xdb" event={"ID":"2226bf67-86e3-4375-a378-075aedce2ea6","Type":"ContainerDied","Data":"f3b6b3105111a6e358c009441f9e38a894d9cc422703fe530e01c68e3285bc91"}
Apr 23 16:35:48.743405 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:48.743378 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:35:48.743567 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:48.743385 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg"
Apr 23 16:35:48.743567 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:48.743483 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7"
Apr 23 16:35:48.743679 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:48.743580 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d"
Apr 23 16:35:49.737645 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:49.737609 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:35:49.738115 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:49.737728 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039"
Apr 23 16:35:50.738954 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:50.738919 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:35:50.739398 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:50.739010 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg"
Apr 23 16:35:50.739398 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:50.739060 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7"
Apr 23 16:35:50.739398 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:50.739101 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d"
Apr 23 16:35:51.737990 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:51.737961 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:35:51.738161 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:51.738078 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039"
Apr 23 16:35:52.738066 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:52.738019 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:35:52.738495 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:52.738157 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7"
Apr 23 16:35:52.738495 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:52.738239 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg"
Apr 23 16:35:52.738495 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:52.738348 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d"
Apr 23 16:35:53.377158 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:53.377119 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs\") pod \"network-metrics-daemon-6wbcq\" (UID: \"adbb31b5-ee6b-431b-ac95-7775688ba039\") " pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:35:53.377325 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:53.377262 2562 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:53.377368 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:53.377335 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs podName:adbb31b5-ee6b-431b-ac95-7775688ba039 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:25.377311217 +0000 UTC m=+65.230126258 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs") pod "network-metrics-daemon-6wbcq" (UID: "adbb31b5-ee6b-431b-ac95-7775688ba039") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 16:35:53.578917 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:53.578882 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95594\" (UniqueName: \"kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594\") pod \"network-check-target-wn6cd\" (UID: \"0ee2ca50-34ff-4830-8c04-92018768a3a7\") " pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:35:53.579079 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:53.579005 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 16:35:53.579079 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:53.579018 2562 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 16:35:53.579079 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:53.579027 2562 projected.go:194] Error preparing data for projected volume kube-api-access-95594 for pod openshift-network-diagnostics/network-check-target-wn6cd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:53.579079 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:53.579075 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594 podName:0ee2ca50-34ff-4830-8c04-92018768a3a7 nodeName:}" failed. No retries permitted until 2026-04-23 16:36:25.579062682 +0000 UTC m=+65.431877724 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-95594" (UniqueName: "kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594") pod "network-check-target-wn6cd" (UID: "0ee2ca50-34ff-4830-8c04-92018768a3a7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 16:35:53.738344 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:53.738313 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:35:53.738690 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:53.738433 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039"
Apr 23 16:35:53.941210 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:53.941177 2562 generic.go:358] "Generic (PLEG): container finished" podID="2226bf67-86e3-4375-a378-075aedce2ea6" containerID="ebd278068bd00dfeb1d00d11ce5977c0cf70f2fa6929ae344b6127e030274d13" exitCode=0
Apr 23 16:35:53.941210 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:53.941218 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q6xdb" event={"ID":"2226bf67-86e3-4375-a378-075aedce2ea6","Type":"ContainerDied","Data":"ebd278068bd00dfeb1d00d11ce5977c0cf70f2fa6929ae344b6127e030274d13"}
Apr 23 16:35:54.740096 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:54.740069 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg"
Apr 23 16:35:54.740452 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:54.740072 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:35:54.740452 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:54.740162 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d"
Apr 23 16:35:54.740452 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:54.740254 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7"
Apr 23 16:35:54.945559 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:54.945521 2562 generic.go:358] "Generic (PLEG): container finished" podID="2226bf67-86e3-4375-a378-075aedce2ea6" containerID="43560647a0d83fe5e35e7621efe038ab83df244a47c4e2f676e25fb242ef67b0" exitCode=0
Apr 23 16:35:54.945714 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:54.945578 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q6xdb" event={"ID":"2226bf67-86e3-4375-a378-075aedce2ea6","Type":"ContainerDied","Data":"43560647a0d83fe5e35e7621efe038ab83df244a47c4e2f676e25fb242ef67b0"}
Apr 23 16:35:55.738296 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:55.738247 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:35:55.738480 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:55.738373 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039"
Apr 23 16:35:55.949688 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:55.949652 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-q6xdb" event={"ID":"2226bf67-86e3-4375-a378-075aedce2ea6","Type":"ContainerStarted","Data":"b27bd303bf503355b178a64c4611493933547b8f7ece179ce5acd8415a617cee"}
Apr 23 16:35:55.981535 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:55.981487 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-q6xdb" podStartSLOduration=4.325186099 podStartE2EDuration="35.98146936s" podCreationTimestamp="2026-04-23 16:35:20 +0000 UTC" firstStartedPulling="2026-04-23 16:35:21.986927637 +0000 UTC m=+1.839742679" lastFinishedPulling="2026-04-23 16:35:53.643210889 +0000 UTC m=+33.496025940" observedRunningTime="2026-04-23 16:35:55.979888419 +0000 UTC m=+35.832703483" watchObservedRunningTime="2026-04-23 16:35:55.98146936 +0000 UTC m=+35.834284424"
Apr 23 16:35:56.740195 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:56.740164 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:35:56.740353 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:56.740164 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg"
Apr 23 16:35:56.740353 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:56.740270 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7"
Apr 23 16:35:56.740353 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:56.740323 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d"
Apr 23 16:35:57.737722 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:57.737687 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:35:57.738164 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:57.737829 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039"
Apr 23 16:35:58.737756 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:58.737712 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:35:58.738127 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:58.737715 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg"
Apr 23 16:35:58.738127 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:58.737817 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7"
Apr 23 16:35:58.738127 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:58.737898 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d"
Apr 23 16:35:59.126887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:59.126790 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg"
Apr 23 16:35:59.127049 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:59.126962 2562 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:59.127049 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:59.127042 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret podName:451de1c0-4375-4c37-8a62-0641aa75255d nodeName:}" failed. No retries permitted until 2026-04-23 16:36:31.127022105 +0000 UTC m=+70.979837160 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret") pod "global-pull-secret-syncer-qzdzg" (UID: "451de1c0-4375-4c37-8a62-0641aa75255d") : object "kube-system"/"original-pull-secret" not registered
Apr 23 16:35:59.738132 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:35:59.738094 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:35:59.738502 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:35:59.738243 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039"
Apr 23 16:36:00.739158 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:00.739130 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:36:00.739622 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:00.739227 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7"
Apr 23 16:36:00.739622 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:00.739300 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg"
Apr 23 16:36:00.739622 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:00.739381 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d"
Apr 23 16:36:01.737685 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:01.737649 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:36:01.737863 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:01.737775 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039"
Apr 23 16:36:02.737886 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:02.737853 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:36:02.738257 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:02.737952 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7"
Apr 23 16:36:02.738257 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:02.737992 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg"
Apr 23 16:36:02.738257 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:02.738041 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d"
Apr 23 16:36:03.738235 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:03.738199 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:36:03.738641 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:03.738309 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039"
Apr 23 16:36:04.737709 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:04.737543 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg"
Apr 23 16:36:04.737883 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:04.737553 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:36:04.737883 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:04.737811 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d"
Apr 23 16:36:04.737953 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:04.737928 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7"
Apr 23 16:36:04.834288 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:04.834250 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qzdzg"]
Apr 23 16:36:04.838790 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:04.838593 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wn6cd"]
Apr 23 16:36:04.839289 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:04.839263 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6wbcq"]
Apr 23 16:36:04.839376 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:04.839368 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:36:04.839508 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:04.839485 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039"
Apr 23 16:36:04.966328 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:04.966294 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:36:04.966499 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:04.966346 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg"
Apr 23 16:36:04.966499 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:04.966421 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d"
Apr 23 16:36:04.966499 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:04.966467 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7"
Apr 23 16:36:06.738448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:06.738409 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:36:06.738955 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:06.738453 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:36:06.738955 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:06.738556 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039"
Apr 23 16:36:06.738955 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:06.738616 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg"
Apr 23 16:36:06.738955 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:06.738608 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7"
Apr 23 16:36:06.738955 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:06.738673 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d"
Apr 23 16:36:08.738418 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:08.738382 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:36:08.738926 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:08.738382 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg"
Apr 23 16:36:08.738926 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:08.738486 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wn6cd" podUID="0ee2ca50-34ff-4830-8c04-92018768a3a7"
Apr 23 16:36:08.738926 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:08.738582 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qzdzg" podUID="451de1c0-4375-4c37-8a62-0641aa75255d"
Apr 23 16:36:08.738926 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:08.738397 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq"
Apr 23 16:36:08.738926 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:08.738665 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-6wbcq" podUID="adbb31b5-ee6b-431b-ac95-7775688ba039"
Apr 23 16:36:09.490919 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.490890 2562 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-57.ec2.internal" event="NodeReady"
Apr 23 16:36:09.491091 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.491023 2562 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 23 16:36:09.564581 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.564547 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-mh25k"]
Apr 23 16:36:09.577202 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.577167 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wfdh5"]
Apr 23 16:36:09.577339 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.577320 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mh25k"
Apr 23 16:36:09.581470 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.581441 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 23 16:36:09.584653 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.583485 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 23 16:36:09.584653 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.583641 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-w6ksl\""
Apr 23 16:36:09.586556 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.586377 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 23 16:36:09.586556 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.586406 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mh25k"]
Apr 23 16:36:09.586556 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.586555 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wfdh5"
Apr 23 16:36:09.591578 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.591559 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 23 16:36:09.591686 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.591582 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-szgdw\""
Apr 23 16:36:09.597863 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.597841 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wfdh5"]
Apr 23 16:36:09.602855 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.602832 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh4gx\" (UniqueName: \"kubernetes.io/projected/aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9-kube-api-access-rh4gx\") pod \"ingress-canary-mh25k\" (UID: \"aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9\") " pod="openshift-ingress-canary/ingress-canary-mh25k"
Apr 23 16:36:09.602940 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.602871 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9-cert\") pod \"ingress-canary-mh25k\" (UID: \"aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9\") " pod="openshift-ingress-canary/ingress-canary-mh25k"
Apr 23 16:36:09.610999 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.610981 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 23 16:36:09.625675 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.625650 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-56lmt"]
Apr 23 16:36:09.637772 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.637733 2562 util.go:30] "No sandbox for pod can
be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.640710 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.640688 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 16:36:09.640819 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.640775 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 16:36:09.640861 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.640839 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 16:36:09.641065 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.641051 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 16:36:09.641541 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.641527 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-6plmq\"" Apr 23 16:36:09.645376 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.645355 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-56lmt"] Apr 23 16:36:09.703690 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.703657 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9-cert\") pod \"ingress-canary-mh25k\" (UID: \"aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9\") " pod="openshift-ingress-canary/ingress-canary-mh25k" Apr 23 16:36:09.703690 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.703699 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/f9f9b357-558a-42d8-b068-7295ba867330-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-56lmt\" (UID: \"f9f9b357-558a-42d8-b068-7295ba867330\") " pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.703958 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.703730 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcfvq\" (UniqueName: \"kubernetes.io/projected/eb5d7213-8672-4ab5-9189-ec203ba60c84-kube-api-access-mcfvq\") pod \"dns-default-wfdh5\" (UID: \"eb5d7213-8672-4ab5-9189-ec203ba60c84\") " pod="openshift-dns/dns-default-wfdh5" Apr 23 16:36:09.703958 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.703764 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f9f9b357-558a-42d8-b068-7295ba867330-data-volume\") pod \"insights-runtime-extractor-56lmt\" (UID: \"f9f9b357-558a-42d8-b068-7295ba867330\") " pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.703958 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.703780 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb5d7213-8672-4ab5-9189-ec203ba60c84-metrics-tls\") pod \"dns-default-wfdh5\" (UID: \"eb5d7213-8672-4ab5-9189-ec203ba60c84\") " pod="openshift-dns/dns-default-wfdh5" Apr 23 16:36:09.703958 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.703855 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f9f9b357-558a-42d8-b068-7295ba867330-crio-socket\") pod \"insights-runtime-extractor-56lmt\" (UID: \"f9f9b357-558a-42d8-b068-7295ba867330\") " pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.703958 ip-10-0-135-57 kubenswrapper[2562]: I0423 
16:36:09.703889 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgp2c\" (UniqueName: \"kubernetes.io/projected/f9f9b357-558a-42d8-b068-7295ba867330-kube-api-access-zgp2c\") pod \"insights-runtime-extractor-56lmt\" (UID: \"f9f9b357-558a-42d8-b068-7295ba867330\") " pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.703958 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.703910 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb5d7213-8672-4ab5-9189-ec203ba60c84-config-volume\") pod \"dns-default-wfdh5\" (UID: \"eb5d7213-8672-4ab5-9189-ec203ba60c84\") " pod="openshift-dns/dns-default-wfdh5" Apr 23 16:36:09.703958 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.703926 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb5d7213-8672-4ab5-9189-ec203ba60c84-tmp-dir\") pod \"dns-default-wfdh5\" (UID: \"eb5d7213-8672-4ab5-9189-ec203ba60c84\") " pod="openshift-dns/dns-default-wfdh5" Apr 23 16:36:09.704210 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.703981 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rh4gx\" (UniqueName: \"kubernetes.io/projected/aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9-kube-api-access-rh4gx\") pod \"ingress-canary-mh25k\" (UID: \"aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9\") " pod="openshift-ingress-canary/ingress-canary-mh25k" Apr 23 16:36:09.704210 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.703997 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f9f9b357-558a-42d8-b068-7295ba867330-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-56lmt\" (UID: 
\"f9f9b357-558a-42d8-b068-7295ba867330\") " pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.707861 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.707832 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9-cert\") pod \"ingress-canary-mh25k\" (UID: \"aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9\") " pod="openshift-ingress-canary/ingress-canary-mh25k" Apr 23 16:36:09.712734 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.712711 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rh4gx\" (UniqueName: \"kubernetes.io/projected/aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9-kube-api-access-rh4gx\") pod \"ingress-canary-mh25k\" (UID: \"aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9\") " pod="openshift-ingress-canary/ingress-canary-mh25k" Apr 23 16:36:09.805144 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.805063 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mcfvq\" (UniqueName: \"kubernetes.io/projected/eb5d7213-8672-4ab5-9189-ec203ba60c84-kube-api-access-mcfvq\") pod \"dns-default-wfdh5\" (UID: \"eb5d7213-8672-4ab5-9189-ec203ba60c84\") " pod="openshift-dns/dns-default-wfdh5" Apr 23 16:36:09.805144 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.805119 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f9f9b357-558a-42d8-b068-7295ba867330-data-volume\") pod \"insights-runtime-extractor-56lmt\" (UID: \"f9f9b357-558a-42d8-b068-7295ba867330\") " pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.805719 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.805146 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb5d7213-8672-4ab5-9189-ec203ba60c84-metrics-tls\") pod 
\"dns-default-wfdh5\" (UID: \"eb5d7213-8672-4ab5-9189-ec203ba60c84\") " pod="openshift-dns/dns-default-wfdh5" Apr 23 16:36:09.805719 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.805190 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f9f9b357-558a-42d8-b068-7295ba867330-crio-socket\") pod \"insights-runtime-extractor-56lmt\" (UID: \"f9f9b357-558a-42d8-b068-7295ba867330\") " pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.805719 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.805215 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgp2c\" (UniqueName: \"kubernetes.io/projected/f9f9b357-558a-42d8-b068-7295ba867330-kube-api-access-zgp2c\") pod \"insights-runtime-extractor-56lmt\" (UID: \"f9f9b357-558a-42d8-b068-7295ba867330\") " pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.805719 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.805233 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb5d7213-8672-4ab5-9189-ec203ba60c84-config-volume\") pod \"dns-default-wfdh5\" (UID: \"eb5d7213-8672-4ab5-9189-ec203ba60c84\") " pod="openshift-dns/dns-default-wfdh5" Apr 23 16:36:09.805719 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.805293 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb5d7213-8672-4ab5-9189-ec203ba60c84-tmp-dir\") pod \"dns-default-wfdh5\" (UID: \"eb5d7213-8672-4ab5-9189-ec203ba60c84\") " pod="openshift-dns/dns-default-wfdh5" Apr 23 16:36:09.805719 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.805389 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/f9f9b357-558a-42d8-b068-7295ba867330-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-56lmt\" (UID: \"f9f9b357-558a-42d8-b068-7295ba867330\") " pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.805719 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.805428 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f9f9b357-558a-42d8-b068-7295ba867330-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-56lmt\" (UID: \"f9f9b357-558a-42d8-b068-7295ba867330\") " pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.805719 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.805442 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f9f9b357-558a-42d8-b068-7295ba867330-crio-socket\") pod \"insights-runtime-extractor-56lmt\" (UID: \"f9f9b357-558a-42d8-b068-7295ba867330\") " pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.805719 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.805451 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f9f9b357-558a-42d8-b068-7295ba867330-data-volume\") pod \"insights-runtime-extractor-56lmt\" (UID: \"f9f9b357-558a-42d8-b068-7295ba867330\") " pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.806170 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.805950 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb5d7213-8672-4ab5-9189-ec203ba60c84-config-volume\") pod \"dns-default-wfdh5\" (UID: \"eb5d7213-8672-4ab5-9189-ec203ba60c84\") " pod="openshift-dns/dns-default-wfdh5" Apr 23 16:36:09.806170 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.805984 2562 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/eb5d7213-8672-4ab5-9189-ec203ba60c84-tmp-dir\") pod \"dns-default-wfdh5\" (UID: \"eb5d7213-8672-4ab5-9189-ec203ba60c84\") " pod="openshift-dns/dns-default-wfdh5" Apr 23 16:36:09.806170 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.806090 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f9f9b357-558a-42d8-b068-7295ba867330-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-56lmt\" (UID: \"f9f9b357-558a-42d8-b068-7295ba867330\") " pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.807390 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.807368 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/eb5d7213-8672-4ab5-9189-ec203ba60c84-metrics-tls\") pod \"dns-default-wfdh5\" (UID: \"eb5d7213-8672-4ab5-9189-ec203ba60c84\") " pod="openshift-dns/dns-default-wfdh5" Apr 23 16:36:09.811774 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.811736 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f9f9b357-558a-42d8-b068-7295ba867330-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-56lmt\" (UID: \"f9f9b357-558a-42d8-b068-7295ba867330\") " pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.817714 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.817690 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgp2c\" (UniqueName: \"kubernetes.io/projected/f9f9b357-558a-42d8-b068-7295ba867330-kube-api-access-zgp2c\") pod \"insights-runtime-extractor-56lmt\" (UID: \"f9f9b357-558a-42d8-b068-7295ba867330\") " pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:09.817949 ip-10-0-135-57 kubenswrapper[2562]: I0423 
16:36:09.817933 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcfvq\" (UniqueName: \"kubernetes.io/projected/eb5d7213-8672-4ab5-9189-ec203ba60c84-kube-api-access-mcfvq\") pod \"dns-default-wfdh5\" (UID: \"eb5d7213-8672-4ab5-9189-ec203ba60c84\") " pod="openshift-dns/dns-default-wfdh5" Apr 23 16:36:09.892005 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.891972 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-mh25k" Apr 23 16:36:09.898140 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.898114 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wfdh5" Apr 23 16:36:09.947044 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:09.947013 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-56lmt" Apr 23 16:36:10.054636 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.054603 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-mh25k"] Apr 23 16:36:10.057842 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.057817 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wfdh5"] Apr 23 16:36:10.082864 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.082841 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-56lmt"] Apr 23 16:36:10.086111 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:36:10.086090 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9f9b357_558a_42d8_b068_7295ba867330.slice/crio-358e5f63958a4f8d814188ec4058b138969932b93c0180a3009d88ffbd81d77c WatchSource:0}: Error finding container 358e5f63958a4f8d814188ec4058b138969932b93c0180a3009d88ffbd81d77c: Status 404 returned error can't find the container with id 
358e5f63958a4f8d814188ec4058b138969932b93c0180a3009d88ffbd81d77c Apr 23 16:36:10.744027 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.743990 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:36:10.744333 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.744028 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:36:10.744881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.744598 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:36:10.747109 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.747085 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 16:36:10.748461 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.748442 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 16:36:10.748578 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.748530 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hhjgv\"" Apr 23 16:36:10.748686 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.748462 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 16:36:10.748789 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.748685 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 16:36:10.749257 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.749238 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5zkmp\"" Apr 
23 16:36:10.978681 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.978606 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wfdh5" event={"ID":"eb5d7213-8672-4ab5-9189-ec203ba60c84","Type":"ContainerStarted","Data":"0a1f58ed19b23745f02746c39e2106cc860d2d88bcce1beff7ba10705f3fb52a"} Apr 23 16:36:10.980940 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.980905 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-56lmt" event={"ID":"f9f9b357-558a-42d8-b068-7295ba867330","Type":"ContainerStarted","Data":"441ee717c50f41e6b11ccd207a2d2b1d4e4688234fe1960cb398e7ab89c771e7"} Apr 23 16:36:10.981050 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.980946 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-56lmt" event={"ID":"f9f9b357-558a-42d8-b068-7295ba867330","Type":"ContainerStarted","Data":"358e5f63958a4f8d814188ec4058b138969932b93c0180a3009d88ffbd81d77c"} Apr 23 16:36:10.982937 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:10.982897 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mh25k" event={"ID":"aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9","Type":"ContainerStarted","Data":"baac5f34f9968bd079474212c1dd7891e55599612b1e3d5f067cead7111911f1"} Apr 23 16:36:11.989654 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:11.989622 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-56lmt" event={"ID":"f9f9b357-558a-42d8-b068-7295ba867330","Type":"ContainerStarted","Data":"bd706b57c8f57792717a8a7745076e880fd553c7ddd7330ddb6b061bca1d0053"} Apr 23 16:36:12.979128 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:12.978919 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9qng7"] Apr 23 16:36:12.981806 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:12.981790 2562 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7" Apr 23 16:36:12.986758 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:12.986554 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 23 16:36:12.986758 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:12.986563 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 16:36:12.986758 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:12.986626 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-75tcj\"" Apr 23 16:36:12.986758 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:12.986641 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 16:36:12.986758 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:12.986639 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 23 16:36:12.986758 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:12.986564 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 16:36:12.993483 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:12.993451 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9qng7"] Apr 23 16:36:12.997582 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:12.997552 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-56lmt" event={"ID":"f9f9b357-558a-42d8-b068-7295ba867330","Type":"ContainerStarted","Data":"eb33a8a889aa48aa77d2dcd78552607a786f954c80208e6ae06b04062f01b1a9"} Apr 23 16:36:12.998901 
ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:12.998875 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-mh25k" event={"ID":"aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9","Type":"ContainerStarted","Data":"a141aefca9a278f995511b2400a3f419adc0c693826d30c1e019eebf61c3eb92"} Apr 23 16:36:13.000571 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.000542 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wfdh5" event={"ID":"eb5d7213-8672-4ab5-9189-ec203ba60c84","Type":"ContainerStarted","Data":"36e7d2414a9dc384705401b34d651b9e815a15930b17b5cc3cc6fb7e481531a1"} Apr 23 16:36:13.000669 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.000574 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wfdh5" event={"ID":"eb5d7213-8672-4ab5-9189-ec203ba60c84","Type":"ContainerStarted","Data":"f4136ea00aedee9e5fb0add8c5e7cd815a4c0217dfe6a6a06f5edd62612081e4"} Apr 23 16:36:13.000707 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.000682 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wfdh5" Apr 23 16:36:13.020298 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.020245 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-mh25k" podStartSLOduration=1.62576462 podStartE2EDuration="4.020230427s" podCreationTimestamp="2026-04-23 16:36:09 +0000 UTC" firstStartedPulling="2026-04-23 16:36:10.066123096 +0000 UTC m=+49.918938141" lastFinishedPulling="2026-04-23 16:36:12.460588902 +0000 UTC m=+52.313403948" observedRunningTime="2026-04-23 16:36:13.019380852 +0000 UTC m=+52.872195911" watchObservedRunningTime="2026-04-23 16:36:13.020230427 +0000 UTC m=+52.873045491" Apr 23 16:36:13.037532 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.037502 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/46010ce7-8871-4f06-90d6-4933fe17216c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9qng7\" (UID: \"46010ce7-8871-4f06-90d6-4933fe17216c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7" Apr 23 16:36:13.037697 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.037575 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46010ce7-8871-4f06-90d6-4933fe17216c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9qng7\" (UID: \"46010ce7-8871-4f06-90d6-4933fe17216c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7" Apr 23 16:36:13.037697 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.037618 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46010ce7-8871-4f06-90d6-4933fe17216c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9qng7\" (UID: \"46010ce7-8871-4f06-90d6-4933fe17216c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7" Apr 23 16:36:13.037697 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.037651 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx5cc\" (UniqueName: \"kubernetes.io/projected/46010ce7-8871-4f06-90d6-4933fe17216c-kube-api-access-bx5cc\") pod \"prometheus-operator-5676c8c784-9qng7\" (UID: \"46010ce7-8871-4f06-90d6-4933fe17216c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7" Apr 23 16:36:13.042734 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.042691 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wfdh5" podStartSLOduration=1.65413715 podStartE2EDuration="4.042677177s" 
podCreationTimestamp="2026-04-23 16:36:09 +0000 UTC" firstStartedPulling="2026-04-23 16:36:10.066548169 +0000 UTC m=+49.919363216" lastFinishedPulling="2026-04-23 16:36:12.455088197 +0000 UTC m=+52.307903243" observedRunningTime="2026-04-23 16:36:13.042083828 +0000 UTC m=+52.894898892" watchObservedRunningTime="2026-04-23 16:36:13.042677177 +0000 UTC m=+52.895492295"
Apr 23 16:36:13.073319 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.073273 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-56lmt" podStartSLOduration=1.780010364 podStartE2EDuration="4.073256049s" podCreationTimestamp="2026-04-23 16:36:09 +0000 UTC" firstStartedPulling="2026-04-23 16:36:10.209623241 +0000 UTC m=+50.062438284" lastFinishedPulling="2026-04-23 16:36:12.502868916 +0000 UTC m=+52.355683969" observedRunningTime="2026-04-23 16:36:13.072167234 +0000 UTC m=+52.924982299" watchObservedRunningTime="2026-04-23 16:36:13.073256049 +0000 UTC m=+52.926071113"
Apr 23 16:36:13.138215 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.138158 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/46010ce7-8871-4f06-90d6-4933fe17216c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9qng7\" (UID: \"46010ce7-8871-4f06-90d6-4933fe17216c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7"
Apr 23 16:36:13.138424 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.138308 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46010ce7-8871-4f06-90d6-4933fe17216c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9qng7\" (UID: \"46010ce7-8871-4f06-90d6-4933fe17216c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7"
Apr 23 16:36:13.138424 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:13.138315 2562 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 23 16:36:13.138424 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.138347 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46010ce7-8871-4f06-90d6-4933fe17216c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9qng7\" (UID: \"46010ce7-8871-4f06-90d6-4933fe17216c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7"
Apr 23 16:36:13.138424 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.138367 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bx5cc\" (UniqueName: \"kubernetes.io/projected/46010ce7-8871-4f06-90d6-4933fe17216c-kube-api-access-bx5cc\") pod \"prometheus-operator-5676c8c784-9qng7\" (UID: \"46010ce7-8871-4f06-90d6-4933fe17216c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7"
Apr 23 16:36:13.138424 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:36:13.138384 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46010ce7-8871-4f06-90d6-4933fe17216c-prometheus-operator-tls podName:46010ce7-8871-4f06-90d6-4933fe17216c nodeName:}" failed. No retries permitted until 2026-04-23 16:36:13.638364003 +0000 UTC m=+53.491179049 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/46010ce7-8871-4f06-90d6-4933fe17216c-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-9qng7" (UID: "46010ce7-8871-4f06-90d6-4933fe17216c") : secret "prometheus-operator-tls" not found
Apr 23 16:36:13.139087 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.139066 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46010ce7-8871-4f06-90d6-4933fe17216c-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-9qng7\" (UID: \"46010ce7-8871-4f06-90d6-4933fe17216c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7"
Apr 23 16:36:13.142310 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.142281 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46010ce7-8871-4f06-90d6-4933fe17216c-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-9qng7\" (UID: \"46010ce7-8871-4f06-90d6-4933fe17216c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7"
Apr 23 16:36:13.149351 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.149331 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx5cc\" (UniqueName: \"kubernetes.io/projected/46010ce7-8871-4f06-90d6-4933fe17216c-kube-api-access-bx5cc\") pod \"prometheus-operator-5676c8c784-9qng7\" (UID: \"46010ce7-8871-4f06-90d6-4933fe17216c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7"
Apr 23 16:36:13.642446 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.642413 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/46010ce7-8871-4f06-90d6-4933fe17216c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9qng7\" (UID: \"46010ce7-8871-4f06-90d6-4933fe17216c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7"
Apr 23 16:36:13.644607 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.644589 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/46010ce7-8871-4f06-90d6-4933fe17216c-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-9qng7\" (UID: \"46010ce7-8871-4f06-90d6-4933fe17216c\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7"
Apr 23 16:36:13.891262 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:13.891223 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7"
Apr 23 16:36:14.007710 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:14.007679 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-9qng7"]
Apr 23 16:36:14.011119 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:36:14.011075 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46010ce7_8871_4f06_90d6_4933fe17216c.slice/crio-1f7ee82a80cdf786536e67157e7155a08e6d3c437695b0fde2914677d3c77b5c WatchSource:0}: Error finding container 1f7ee82a80cdf786536e67157e7155a08e6d3c437695b0fde2914677d3c77b5c: Status 404 returned error can't find the container with id 1f7ee82a80cdf786536e67157e7155a08e6d3c437695b0fde2914677d3c77b5c
Apr 23 16:36:15.006068 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:15.006035 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7" event={"ID":"46010ce7-8871-4f06-90d6-4933fe17216c","Type":"ContainerStarted","Data":"1f7ee82a80cdf786536e67157e7155a08e6d3c437695b0fde2914677d3c77b5c"}
Apr 23 16:36:16.010620 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:16.010582 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7" event={"ID":"46010ce7-8871-4f06-90d6-4933fe17216c","Type":"ContainerStarted","Data":"e78b53d0814e7c7b762a8cef195cfc43e2229fa6cd54db6015da01a51cc54a83"}
Apr 23 16:36:16.010620 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:16.010622 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7" event={"ID":"46010ce7-8871-4f06-90d6-4933fe17216c","Type":"ContainerStarted","Data":"724fb2b15887194a30448cecb7cbc98bb3d28a77a50f1fd50232549a3f1bce1d"}
Apr 23 16:36:16.031283 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:16.031228 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-9qng7" podStartSLOduration=2.651222739 podStartE2EDuration="4.03121314s" podCreationTimestamp="2026-04-23 16:36:12 +0000 UTC" firstStartedPulling="2026-04-23 16:36:14.013101386 +0000 UTC m=+53.865916429" lastFinishedPulling="2026-04-23 16:36:15.393091788 +0000 UTC m=+55.245906830" observedRunningTime="2026-04-23 16:36:16.02998163 +0000 UTC m=+55.882796763" watchObservedRunningTime="2026-04-23 16:36:16.03121314 +0000 UTC m=+55.884028204"
Apr 23 16:36:18.383125 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.383091 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cxgtz"]
Apr 23 16:36:18.387080 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.387064 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.390490 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.390467 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 23 16:36:18.390582 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.390499 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 23 16:36:18.390642 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.390592 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 23 16:36:18.390984 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.390971 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-k5wxx\""
Apr 23 16:36:18.474154 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.474123 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/354460b2-9790-4c05-9a2a-dc4bab2fa675-root\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.474314 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.474177 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/354460b2-9790-4c05-9a2a-dc4bab2fa675-node-exporter-tls\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.474314 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.474235 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/354460b2-9790-4c05-9a2a-dc4bab2fa675-node-exporter-wtmp\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.474314 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.474258 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/354460b2-9790-4c05-9a2a-dc4bab2fa675-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.474314 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.474284 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/354460b2-9790-4c05-9a2a-dc4bab2fa675-node-exporter-accelerators-collector-config\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.474445 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.474365 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/354460b2-9790-4c05-9a2a-dc4bab2fa675-sys\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.474445 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.474402 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94rfv\" (UniqueName: \"kubernetes.io/projected/354460b2-9790-4c05-9a2a-dc4bab2fa675-kube-api-access-94rfv\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.474445 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.474441 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/354460b2-9790-4c05-9a2a-dc4bab2fa675-node-exporter-textfile\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.474536 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.474459 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/354460b2-9790-4c05-9a2a-dc4bab2fa675-metrics-client-ca\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.575216 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.575183 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/354460b2-9790-4c05-9a2a-dc4bab2fa675-node-exporter-textfile\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.575216 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.575217 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/354460b2-9790-4c05-9a2a-dc4bab2fa675-metrics-client-ca\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.575442 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.575237 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/354460b2-9790-4c05-9a2a-dc4bab2fa675-root\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.575442 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.575278 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/354460b2-9790-4c05-9a2a-dc4bab2fa675-node-exporter-tls\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.575442 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.575313 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/354460b2-9790-4c05-9a2a-dc4bab2fa675-node-exporter-wtmp\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.575442 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.575336 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/354460b2-9790-4c05-9a2a-dc4bab2fa675-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.575442 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.575356 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/354460b2-9790-4c05-9a2a-dc4bab2fa675-node-exporter-accelerators-collector-config\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.575442 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.575381 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/354460b2-9790-4c05-9a2a-dc4bab2fa675-root\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.575442 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.575391 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/354460b2-9790-4c05-9a2a-dc4bab2fa675-sys\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.575442 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.575433 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/354460b2-9790-4c05-9a2a-dc4bab2fa675-sys\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.575821 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.575465 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94rfv\" (UniqueName: \"kubernetes.io/projected/354460b2-9790-4c05-9a2a-dc4bab2fa675-kube-api-access-94rfv\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.575821 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.575489 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/354460b2-9790-4c05-9a2a-dc4bab2fa675-node-exporter-wtmp\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.575821 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.575569 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/354460b2-9790-4c05-9a2a-dc4bab2fa675-node-exporter-textfile\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.575921 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.575858 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/354460b2-9790-4c05-9a2a-dc4bab2fa675-metrics-client-ca\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.576127 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.576110 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/354460b2-9790-4c05-9a2a-dc4bab2fa675-node-exporter-accelerators-collector-config\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.577680 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.577658 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/354460b2-9790-4c05-9a2a-dc4bab2fa675-node-exporter-tls\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.577805 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.577787 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/354460b2-9790-4c05-9a2a-dc4bab2fa675-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.587820 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.587798 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94rfv\" (UniqueName: \"kubernetes.io/projected/354460b2-9790-4c05-9a2a-dc4bab2fa675-kube-api-access-94rfv\") pod \"node-exporter-cxgtz\" (UID: \"354460b2-9790-4c05-9a2a-dc4bab2fa675\") " pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.695967 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.695935 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cxgtz"
Apr 23 16:36:18.703727 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:36:18.703699 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod354460b2_9790_4c05_9a2a_dc4bab2fa675.slice/crio-7ab34da7bb63405747b43f6f4a8b44f8d2f3ca033711cb13907938b8c43efca2 WatchSource:0}: Error finding container 7ab34da7bb63405747b43f6f4a8b44f8d2f3ca033711cb13907938b8c43efca2: Status 404 returned error can't find the container with id 7ab34da7bb63405747b43f6f4a8b44f8d2f3ca033711cb13907938b8c43efca2
Apr 23 16:36:18.940115 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:18.940084 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-krt5k"
Apr 23 16:36:19.021168 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:19.021086 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxgtz" event={"ID":"354460b2-9790-4c05-9a2a-dc4bab2fa675","Type":"ContainerStarted","Data":"7ab34da7bb63405747b43f6f4a8b44f8d2f3ca033711cb13907938b8c43efca2"}
Apr 23 16:36:20.025119 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:20.025082 2562 generic.go:358] "Generic (PLEG): container finished" podID="354460b2-9790-4c05-9a2a-dc4bab2fa675" containerID="aa5d28b47b3a1d2dc6961ecd0b6edb29b449b08ed9aba36afb1828f8ec3a2a4a" exitCode=0
Apr 23 16:36:20.025119 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:20.025123 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxgtz" event={"ID":"354460b2-9790-4c05-9a2a-dc4bab2fa675","Type":"ContainerDied","Data":"aa5d28b47b3a1d2dc6961ecd0b6edb29b449b08ed9aba36afb1828f8ec3a2a4a"}
Apr 23 16:36:21.029698 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.029654 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxgtz" event={"ID":"354460b2-9790-4c05-9a2a-dc4bab2fa675","Type":"ContainerStarted","Data":"e2537fb386fe56a0dbd4ee9d00f9241981c6cc9b522e13441b3dc5ebfc7ee105"}
Apr 23 16:36:21.029698 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.029702 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cxgtz" event={"ID":"354460b2-9790-4c05-9a2a-dc4bab2fa675","Type":"ContainerStarted","Data":"682f0016822ab8ffc5d76f72afcbb3cd957da578863a7c6db437826798881842"}
Apr 23 16:36:21.073528 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.073475 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cxgtz" podStartSLOduration=2.251035426 podStartE2EDuration="3.073461324s" podCreationTimestamp="2026-04-23 16:36:18 +0000 UTC" firstStartedPulling="2026-04-23 16:36:18.70567002 +0000 UTC m=+58.558485065" lastFinishedPulling="2026-04-23 16:36:19.52809592 +0000 UTC m=+59.380910963" observedRunningTime="2026-04-23 16:36:21.073256554 +0000 UTC m=+60.926071619" watchObservedRunningTime="2026-04-23 16:36:21.073461324 +0000 UTC m=+60.926276388"
Apr 23 16:36:21.335543 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.335464 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6cbbbb9d44-bmcf2"]
Apr 23 16:36:21.339605 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.339587 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.346866 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.346845 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 23 16:36:21.346979 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.346958 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 23 16:36:21.348440 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.348425 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-g68ff\""
Apr 23 16:36:21.348564 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.348549 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 23 16:36:21.348714 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.348696 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 23 16:36:21.348803 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.348703 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 23 16:36:21.349645 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.349631 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 23 16:36:21.359517 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.359499 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 23 16:36:21.360892 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.360870 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 23 16:36:21.361618 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.361590 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cbbbb9d44-bmcf2"]
Apr 23 16:36:21.400412 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.400378 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-config\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.400584 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.400464 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d56vj\" (UniqueName: \"kubernetes.io/projected/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-kube-api-access-d56vj\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.400584 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.400511 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-serving-cert\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.400584 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.400534 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-oauth-config\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.400584 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.400560 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-service-ca\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.400584 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.400578 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-trusted-ca-bundle\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.400794 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.400668 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-oauth-serving-cert\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.457240 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.457207 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7cbf84d895-rj6tl"]
Apr 23 16:36:21.460423 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.460404 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl"
Apr 23 16:36:21.464412 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.464393 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 23 16:36:21.464519 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.464451 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 23 16:36:21.464519 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.464476 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 23 16:36:21.464519 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.464495 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-cs59eopmq9ef3\""
Apr 23 16:36:21.464667 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.464548 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-mpsbc\""
Apr 23 16:36:21.464936 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.464919 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 23 16:36:21.464992 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.464959 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 23 16:36:21.478454 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.478431 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7cbf84d895-rj6tl"]
Apr 23 16:36:21.501754 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.501708 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-serving-cert\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.501863 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.501769 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-oauth-config\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.501863 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.501797 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-service-ca\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.501863 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.501823 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-trusted-ca-bundle\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.502071 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.501865 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-oauth-serving-cert\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.502071 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.501898 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-config\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.502071 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.501942 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d56vj\" (UniqueName: \"kubernetes.io/projected/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-kube-api-access-d56vj\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.502623 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.502587 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-service-ca\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.502715 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.502622 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-config\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.502715 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.502685 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-oauth-serving-cert\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.502820 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.502735 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-trusted-ca-bundle\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.504367 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.504348 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-oauth-config\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.504367 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.504355 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-serving-cert\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.514200 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.514179 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d56vj\" (UniqueName: \"kubernetes.io/projected/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-kube-api-access-d56vj\") pod \"console-6cbbbb9d44-bmcf2\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") " pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:21.602955 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.602878 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl"
Apr 23 16:36:21.602955 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.602914 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfmtd\" (UniqueName: \"kubernetes.io/projected/d2eb8e62-9f07-4d19-9d67-f66771d26287-kube-api-access-bfmtd\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl"
Apr 23 16:36:21.603135 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.602955 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-grpc-tls\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl"
Apr 23 16:36:21.603135 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.602986 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl"
Apr 23 16:36:21.603135 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.603026 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-thanos-querier-tls\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl"
Apr 23 16:36:21.603135 ip-10-0-135-57
kubenswrapper[2562]: I0423 16:36:21.603068 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.603135 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.603087 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.603135 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.603124 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d2eb8e62-9f07-4d19-9d67-f66771d26287-metrics-client-ca\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.648294 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.648257 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6cbbbb9d44-bmcf2" Apr 23 16:36:21.704129 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.704098 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-thanos-querier-tls\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.704129 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.704141 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.704367 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.704162 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.704367 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.704181 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d2eb8e62-9f07-4d19-9d67-f66771d26287-metrics-client-ca\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.704367 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.704219 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.704367 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.704236 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfmtd\" (UniqueName: \"kubernetes.io/projected/d2eb8e62-9f07-4d19-9d67-f66771d26287-kube-api-access-bfmtd\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.704367 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.704255 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-grpc-tls\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.704367 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.704276 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.705471 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.705426 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d2eb8e62-9f07-4d19-9d67-f66771d26287-metrics-client-ca\") pod 
\"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.707199 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.707169 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.708095 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.708064 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.711068 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.710907 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-thanos-querier-tls\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.711068 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.711015 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 
16:36:21.711579 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.711540 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.713175 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.713145 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d2eb8e62-9f07-4d19-9d67-f66771d26287-secret-grpc-tls\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.714216 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.714198 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfmtd\" (UniqueName: \"kubernetes.io/projected/d2eb8e62-9f07-4d19-9d67-f66771d26287-kube-api-access-bfmtd\") pod \"thanos-querier-7cbf84d895-rj6tl\" (UID: \"d2eb8e62-9f07-4d19-9d67-f66771d26287\") " pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.766776 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.766729 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6cbbbb9d44-bmcf2"] Apr 23 16:36:21.769832 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.769811 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:21.770486 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:36:21.770470 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b9341c_ba96_4e86_80f2_5260ed07f0a1.slice/crio-66059a173649183a80e48974ce0abc52b3f39b7b5cdbf7519ee7eee4ed48e10b WatchSource:0}: Error finding container 66059a173649183a80e48974ce0abc52b3f39b7b5cdbf7519ee7eee4ed48e10b: Status 404 returned error can't find the container with id 66059a173649183a80e48974ce0abc52b3f39b7b5cdbf7519ee7eee4ed48e10b Apr 23 16:36:21.898932 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:21.898893 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7cbf84d895-rj6tl"] Apr 23 16:36:21.903466 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:36:21.903442 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2eb8e62_9f07_4d19_9d67_f66771d26287.slice/crio-d107f9a89db29ccac5886a8294cab65cb980f3d40099edae75ef67ae01c04e1c WatchSource:0}: Error finding container d107f9a89db29ccac5886a8294cab65cb980f3d40099edae75ef67ae01c04e1c: Status 404 returned error can't find the container with id d107f9a89db29ccac5886a8294cab65cb980f3d40099edae75ef67ae01c04e1c Apr 23 16:36:22.033864 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:22.033829 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cbbbb9d44-bmcf2" event={"ID":"d1b9341c-ba96-4e86-80f2-5260ed07f0a1","Type":"ContainerStarted","Data":"66059a173649183a80e48974ce0abc52b3f39b7b5cdbf7519ee7eee4ed48e10b"} Apr 23 16:36:22.034912 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:22.034886 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" 
event={"ID":"d2eb8e62-9f07-4d19-9d67-f66771d26287","Type":"ContainerStarted","Data":"d107f9a89db29ccac5886a8294cab65cb980f3d40099edae75ef67ae01c04e1c"} Apr 23 16:36:22.909853 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:22.909820 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj"] Apr 23 16:36:22.913541 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:22.913267 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:22.918459 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:22.918433 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 23 16:36:22.918726 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:22.918703 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 23 16:36:22.918821 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:22.918756 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 23 16:36:22.918821 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:22.918786 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-jlvnw\"" Apr 23 16:36:22.919171 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:22.918925 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 16:36:22.919171 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:22.919032 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-cgp8lamtcggms\"" Apr 23 16:36:22.927090 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:22.927061 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj"] Apr 23 16:36:23.005353 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.005323 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wfdh5" Apr 23 16:36:23.015814 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.015780 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3b9aeaea-8c1b-40ec-b888-528be672bf52-audit-log\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.015989 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.015822 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2bnj\" (UniqueName: \"kubernetes.io/projected/3b9aeaea-8c1b-40ec-b888-528be672bf52-kube-api-access-z2bnj\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.015989 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.015883 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3b9aeaea-8c1b-40ec-b888-528be672bf52-secret-metrics-server-client-certs\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.015989 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.015939 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3b9aeaea-8c1b-40ec-b888-528be672bf52-secret-metrics-server-tls\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: 
\"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.016123 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.016030 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3b9aeaea-8c1b-40ec-b888-528be672bf52-metrics-server-audit-profiles\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.016123 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.016083 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b9aeaea-8c1b-40ec-b888-528be672bf52-client-ca-bundle\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.016191 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.016130 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b9aeaea-8c1b-40ec-b888-528be672bf52-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.117391 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.117326 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3b9aeaea-8c1b-40ec-b888-528be672bf52-audit-log\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.117857 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.117396 
2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2bnj\" (UniqueName: \"kubernetes.io/projected/3b9aeaea-8c1b-40ec-b888-528be672bf52-kube-api-access-z2bnj\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.117857 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.117460 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3b9aeaea-8c1b-40ec-b888-528be672bf52-secret-metrics-server-client-certs\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.117857 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.117520 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3b9aeaea-8c1b-40ec-b888-528be672bf52-secret-metrics-server-tls\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.117857 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.117569 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3b9aeaea-8c1b-40ec-b888-528be672bf52-metrics-server-audit-profiles\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.117857 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.117621 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b9aeaea-8c1b-40ec-b888-528be672bf52-client-ca-bundle\") pod 
\"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.117857 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.117671 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b9aeaea-8c1b-40ec-b888-528be672bf52-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.118157 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.118136 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/3b9aeaea-8c1b-40ec-b888-528be672bf52-audit-log\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.118463 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.118417 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3b9aeaea-8c1b-40ec-b888-528be672bf52-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.119818 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.119793 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/3b9aeaea-8c1b-40ec-b888-528be672bf52-metrics-server-audit-profiles\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.120786 ip-10-0-135-57 kubenswrapper[2562]: I0423 
16:36:23.120765 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/3b9aeaea-8c1b-40ec-b888-528be672bf52-secret-metrics-server-tls\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.121317 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.121297 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/3b9aeaea-8c1b-40ec-b888-528be672bf52-secret-metrics-server-client-certs\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.123479 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.123436 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b9aeaea-8c1b-40ec-b888-528be672bf52-client-ca-bundle\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.129320 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.129214 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2bnj\" (UniqueName: \"kubernetes.io/projected/3b9aeaea-8c1b-40ec-b888-528be672bf52-kube-api-access-z2bnj\") pod \"metrics-server-6c9b88dfdd-b9ljj\" (UID: \"3b9aeaea-8c1b-40ec-b888-528be672bf52\") " pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.163820 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.163172 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-wrchg"] Apr 23 16:36:23.166726 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.166699 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wrchg" Apr 23 16:36:23.171110 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.170853 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 23 16:36:23.171233 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.171190 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-8xnml\"" Apr 23 16:36:23.191692 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.191631 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-wrchg"] Apr 23 16:36:23.225899 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.225853 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:23.320162 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.320112 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/848ea94b-d7c7-4f3c-8f74-d35ab6f9770a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wrchg\" (UID: \"848ea94b-d7c7-4f3c-8f74-d35ab6f9770a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wrchg" Apr 23 16:36:23.421327 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.421237 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/848ea94b-d7c7-4f3c-8f74-d35ab6f9770a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wrchg\" (UID: \"848ea94b-d7c7-4f3c-8f74-d35ab6f9770a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wrchg" Apr 23 16:36:23.425492 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.425460 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/848ea94b-d7c7-4f3c-8f74-d35ab6f9770a-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-wrchg\" (UID: \"848ea94b-d7c7-4f3c-8f74-d35ab6f9770a\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wrchg" Apr 23 16:36:23.479153 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.479120 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wrchg" Apr 23 16:36:23.615331 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.615298 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf"] Apr 23 16:36:23.623502 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.623478 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.626661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.626636 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 23 16:36:23.626828 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.626737 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 23 16:36:23.626828 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.626812 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 23 16:36:23.626944 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.626882 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-jb5pt\"" Apr 23 16:36:23.627134 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.627113 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 23 
16:36:23.627243 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.627137 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 23 16:36:23.633423 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.633399 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 23 16:36:23.634149 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.634113 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf"] Apr 23 16:36:23.724809 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.724708 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/acf14525-d550-4e69-b0a5-9d0557d765ad-federate-client-tls\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.724966 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.724825 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/acf14525-d550-4e69-b0a5-9d0557d765ad-telemeter-client-tls\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.724966 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.724877 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/acf14525-d550-4e69-b0a5-9d0557d765ad-metrics-client-ca\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " 
pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.725082 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.724972 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/acf14525-d550-4e69-b0a5-9d0557d765ad-secret-telemeter-client\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.725082 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.725067 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/acf14525-d550-4e69-b0a5-9d0557d765ad-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.725183 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.725101 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhvmg\" (UniqueName: \"kubernetes.io/projected/acf14525-d550-4e69-b0a5-9d0557d765ad-kube-api-access-bhvmg\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.725236 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.725203 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf14525-d550-4e69-b0a5-9d0557d765ad-serving-certs-ca-bundle\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.725296 ip-10-0-135-57 
kubenswrapper[2562]: I0423 16:36:23.725241 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf14525-d550-4e69-b0a5-9d0557d765ad-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.825899 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.825862 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/acf14525-d550-4e69-b0a5-9d0557d765ad-federate-client-tls\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.826080 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.825929 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/acf14525-d550-4e69-b0a5-9d0557d765ad-telemeter-client-tls\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.826080 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.825970 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/acf14525-d550-4e69-b0a5-9d0557d765ad-metrics-client-ca\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.826080 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.826000 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: 
\"kubernetes.io/secret/acf14525-d550-4e69-b0a5-9d0557d765ad-secret-telemeter-client\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.826080 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.826047 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/acf14525-d550-4e69-b0a5-9d0557d765ad-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.826080 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.826074 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhvmg\" (UniqueName: \"kubernetes.io/projected/acf14525-d550-4e69-b0a5-9d0557d765ad-kube-api-access-bhvmg\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.826357 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.826152 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf14525-d550-4e69-b0a5-9d0557d765ad-serving-certs-ca-bundle\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.826357 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.826177 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf14525-d550-4e69-b0a5-9d0557d765ad-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: 
\"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.827265 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.826794 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/acf14525-d550-4e69-b0a5-9d0557d765ad-metrics-client-ca\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.827265 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.827112 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf14525-d550-4e69-b0a5-9d0557d765ad-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.827445 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.827414 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/acf14525-d550-4e69-b0a5-9d0557d765ad-serving-certs-ca-bundle\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.829014 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.828985 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/acf14525-d550-4e69-b0a5-9d0557d765ad-federate-client-tls\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.829172 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.829139 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/acf14525-d550-4e69-b0a5-9d0557d765ad-telemeter-client-tls\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.829544 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.829521 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/acf14525-d550-4e69-b0a5-9d0557d765ad-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.829777 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.829732 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/acf14525-d550-4e69-b0a5-9d0557d765ad-secret-telemeter-client\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.835594 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.835566 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhvmg\" (UniqueName: \"kubernetes.io/projected/acf14525-d550-4e69-b0a5-9d0557d765ad-kube-api-access-bhvmg\") pod \"telemeter-client-6bfdbdfb67-mxpcf\" (UID: \"acf14525-d550-4e69-b0a5-9d0557d765ad\") " pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:23.935755 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:23.935710 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" Apr 23 16:36:24.659886 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.659836 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:36:24.680917 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.680709 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.683915 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.683844 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:36:24.684756 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.684710 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-tdx6x\"" Apr 23 16:36:24.686313 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.686279 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 16:36:24.686805 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.686785 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 16:36:24.686979 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.686957 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 16:36:24.687069 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.687032 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 16:36:24.687944 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.687353 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 16:36:24.687944 ip-10-0-135-57 
kubenswrapper[2562]: I0423 16:36:24.687433 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 16:36:24.687944 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.687652 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 16:36:24.687944 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.687823 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 16:36:24.687944 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.687880 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 16:36:24.691604 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.688201 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4chnbk5ae574f\"" Apr 23 16:36:24.691604 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.688710 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 16:36:24.691604 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.689400 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 16:36:24.691604 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.691442 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 16:36:24.834752 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.834695 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.834925 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.834775 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.834925 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.834831 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-config\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.834925 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.834861 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.834925 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.834896 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.835171 ip-10-0-135-57 kubenswrapper[2562]: I0423 
16:36:24.834958 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.835171 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.834982 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.835171 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.835011 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.835171 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.835096 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.835171 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.835145 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.835171 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.835173 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.835441 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.835200 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.835441 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.835229 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.835441 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.835335 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nc75\" (UniqueName: \"kubernetes.io/projected/e86cbe0f-1d12-44e9-8b77-a009364ca559-kube-api-access-2nc75\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.835441 ip-10-0-135-57 kubenswrapper[2562]: I0423 
16:36:24.835392 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.835441 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.835420 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-web-config\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.835667 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.835461 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e86cbe0f-1d12-44e9-8b77-a009364ca559-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.835667 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.835485 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e86cbe0f-1d12-44e9-8b77-a009364ca559-config-out\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936263 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936178 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936263 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936235 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nc75\" (UniqueName: \"kubernetes.io/projected/e86cbe0f-1d12-44e9-8b77-a009364ca559-kube-api-access-2nc75\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936492 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936269 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936492 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936293 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-web-config\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936492 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936335 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e86cbe0f-1d12-44e9-8b77-a009364ca559-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936492 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936358 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e86cbe0f-1d12-44e9-8b77-a009364ca559-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936492 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936392 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936492 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936421 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936492 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936450 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-config\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936492 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936476 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936871 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936508 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936871 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936565 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936871 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936588 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936871 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936618 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936871 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936664 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936871 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936705 2562 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936871 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936730 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.936871 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.936786 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.937694 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.937662 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.938346 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.938238 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.941034 
ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.939205 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.941034 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.939587 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.942349 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.942325 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-config\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.942608 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.942588 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.942711 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.942692 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e86cbe0f-1d12-44e9-8b77-a009364ca559-config-out\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.943089 ip-10-0-135-57 kubenswrapper[2562]: 
I0423 16:36:24.943070 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.943659 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.943633 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.944403 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.944378 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.945009 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.944986 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.953615 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.953548 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-web-config\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 
16:36:24.953800 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.953764 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.954158 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.954131 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e86cbe0f-1d12-44e9-8b77-a009364ca559-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.954304 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.954282 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nc75\" (UniqueName: \"kubernetes.io/projected/e86cbe0f-1d12-44e9-8b77-a009364ca559-kube-api-access-2nc75\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.954653 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.954612 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.955143 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.955112 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 
16:36:24.963445 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.963411 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:24.997560 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:24.997494 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:25.441173 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.441135 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs\") pod \"network-metrics-daemon-6wbcq\" (UID: \"adbb31b5-ee6b-431b-ac95-7775688ba039\") " pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:36:25.444180 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.444136 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 16:36:25.456417 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.456330 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/adbb31b5-ee6b-431b-ac95-7775688ba039-metrics-certs\") pod \"network-metrics-daemon-6wbcq\" (UID: \"adbb31b5-ee6b-431b-ac95-7775688ba039\") " pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:36:25.491995 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.489776 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-5zkmp\"" Apr 23 16:36:25.501321 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.500015 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-6wbcq" Apr 23 16:36:25.589422 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.589273 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf"] Apr 23 16:36:25.600921 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.600861 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-wrchg"] Apr 23 16:36:25.644473 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.644438 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95594\" (UniqueName: \"kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594\") pod \"network-check-target-wn6cd\" (UID: \"0ee2ca50-34ff-4830-8c04-92018768a3a7\") " pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:36:25.647725 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.647691 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 16:36:25.657512 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.657481 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 16:36:25.668316 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.668293 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95594\" (UniqueName: \"kubernetes.io/projected/0ee2ca50-34ff-4830-8c04-92018768a3a7-kube-api-access-95594\") pod \"network-check-target-wn6cd\" (UID: \"0ee2ca50-34ff-4830-8c04-92018768a3a7\") " pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:36:25.682990 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.682829 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-6wbcq"] Apr 23 16:36:25.686370 
ip-10-0-135-57 kubenswrapper[2562]: W0423 16:36:25.686337 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadbb31b5_ee6b_431b_ac95_7775688ba039.slice/crio-b706dbf5afd0e5cae9b024ca7bbfd1d09df4ad146c8561a323ac334c74ba9aa1 WatchSource:0}: Error finding container b706dbf5afd0e5cae9b024ca7bbfd1d09df4ad146c8561a323ac334c74ba9aa1: Status 404 returned error can't find the container with id b706dbf5afd0e5cae9b024ca7bbfd1d09df4ad146c8561a323ac334c74ba9aa1 Apr 23 16:36:25.778582 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.778552 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-hhjgv\"" Apr 23 16:36:25.786140 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.786108 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:36:25.833598 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.833547 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj"] Apr 23 16:36:25.841880 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.841824 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:36:25.922999 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:25.922968 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wn6cd"] Apr 23 16:36:25.926394 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:36:25.926364 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ee2ca50_34ff_4830_8c04_92018768a3a7.slice/crio-0b0618b67b773f28f99458a483cad23acd13f859a6172af55c2fbad4108d8cbb WatchSource:0}: Error finding container 0b0618b67b773f28f99458a483cad23acd13f859a6172af55c2fbad4108d8cbb: Status 404 
returned error can't find the container with id 0b0618b67b773f28f99458a483cad23acd13f859a6172af55c2fbad4108d8cbb Apr 23 16:36:26.049156 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:26.049117 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cbbbb9d44-bmcf2" event={"ID":"d1b9341c-ba96-4e86-80f2-5260ed07f0a1","Type":"ContainerStarted","Data":"2dc1118604faa682a1571b866c0bc4a0a1333cd0f6326f076bdc73c31aecaf17"} Apr 23 16:36:26.050140 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:26.050116 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wn6cd" event={"ID":"0ee2ca50-34ff-4830-8c04-92018768a3a7","Type":"ContainerStarted","Data":"0b0618b67b773f28f99458a483cad23acd13f859a6172af55c2fbad4108d8cbb"} Apr 23 16:36:26.051136 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:26.051117 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wrchg" event={"ID":"848ea94b-d7c7-4f3c-8f74-d35ab6f9770a","Type":"ContainerStarted","Data":"1d9b022bbf08578d47110d70f4545af8623bd77425ea8e3a699af917fc2bfd67"} Apr 23 16:36:26.052858 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:26.052831 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" event={"ID":"d2eb8e62-9f07-4d19-9d67-f66771d26287","Type":"ContainerStarted","Data":"5a73d52910651445ae656e79e71344824f5836abe33a773fceac12d78988a20c"} Apr 23 16:36:26.052858 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:26.052856 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" event={"ID":"d2eb8e62-9f07-4d19-9d67-f66771d26287","Type":"ContainerStarted","Data":"da82c4a83fbe8a079e3afe6cb9246ac1bff5150befe071c3027a27db91058539"} Apr 23 16:36:26.052997 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:26.052870 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" event={"ID":"d2eb8e62-9f07-4d19-9d67-f66771d26287","Type":"ContainerStarted","Data":"96ca1755924db0ae387e7a880e71dd6c118bfd958d549e8f502aa9f0d08831ed"} Apr 23 16:36:26.053848 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:26.053819 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6wbcq" event={"ID":"adbb31b5-ee6b-431b-ac95-7775688ba039","Type":"ContainerStarted","Data":"b706dbf5afd0e5cae9b024ca7bbfd1d09df4ad146c8561a323ac334c74ba9aa1"} Apr 23 16:36:26.054822 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:26.054802 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" event={"ID":"3b9aeaea-8c1b-40ec-b888-528be672bf52","Type":"ContainerStarted","Data":"cd97260cf779085ab9ca28506bafa821b60764be56ae308530cb38cd16868033"} Apr 23 16:36:26.055676 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:26.055657 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" event={"ID":"acf14525-d550-4e69-b0a5-9d0557d765ad","Type":"ContainerStarted","Data":"a2af1651f2f1c6f473b80a4bd4c87de1dc8ae25c88e6220ba81a8f2db55165c9"} Apr 23 16:36:26.056578 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:26.056561 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e86cbe0f-1d12-44e9-8b77-a009364ca559","Type":"ContainerStarted","Data":"53894a29fa714ab12508ee5acc3c6d6e457dcca493ea53d8a4d8ce0d8411e563"} Apr 23 16:36:26.071996 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:26.071955 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6cbbbb9d44-bmcf2" podStartSLOduration=1.4329324030000001 podStartE2EDuration="5.071942589s" podCreationTimestamp="2026-04-23 16:36:21 +0000 UTC" firstStartedPulling="2026-04-23 16:36:21.773566317 +0000 UTC m=+61.626381362" 
lastFinishedPulling="2026-04-23 16:36:25.412576495 +0000 UTC m=+65.265391548" observedRunningTime="2026-04-23 16:36:26.071366773 +0000 UTC m=+65.924181847" watchObservedRunningTime="2026-04-23 16:36:26.071942589 +0000 UTC m=+65.924757653" Apr 23 16:36:29.915932 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:29.915897 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cbbbb9d44-bmcf2"] Apr 23 16:36:31.097123 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.097085 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" event={"ID":"d2eb8e62-9f07-4d19-9d67-f66771d26287","Type":"ContainerStarted","Data":"33733baf59db54d98edf66ccdfc889b42e306df69fbe6e50178acbdac516dee6"} Apr 23 16:36:31.097123 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.097129 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" event={"ID":"d2eb8e62-9f07-4d19-9d67-f66771d26287","Type":"ContainerStarted","Data":"153a533f0b96690df8a68933e16f5877dee08f07b6449dcc49fe145b7f7e1551"} Apr 23 16:36:31.097610 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.097142 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" event={"ID":"d2eb8e62-9f07-4d19-9d67-f66771d26287","Type":"ContainerStarted","Data":"b98996da87d065c77fcbd50d50899553f00c3435d56737736eb81282dd43ec8a"} Apr 23 16:36:31.097610 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.097420 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:31.099325 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.099294 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6wbcq" 
event={"ID":"adbb31b5-ee6b-431b-ac95-7775688ba039","Type":"ContainerStarted","Data":"bb3f4205eb39d3c9d05dcce73d175c7e986bb1dc54e93a74599b1372a96ecaf4"} Apr 23 16:36:31.099513 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.099496 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-6wbcq" event={"ID":"adbb31b5-ee6b-431b-ac95-7775688ba039","Type":"ContainerStarted","Data":"d4933a704295055bf761808767db11a569d9a5849a136f2e1eca8bf1933ae4f1"} Apr 23 16:36:31.100793 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.100770 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" event={"ID":"3b9aeaea-8c1b-40ec-b888-528be672bf52","Type":"ContainerStarted","Data":"e81739212bf8c9b3057d7feb5b02b7ed8aba9c23836b0168afc7b7eb59cd22a7"} Apr 23 16:36:31.103044 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.103007 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" event={"ID":"acf14525-d550-4e69-b0a5-9d0557d765ad","Type":"ContainerStarted","Data":"130f5b5ed0430e67623b4782dae80ebfb592e19aa6443b3172d84d10acbf4fac"} Apr 23 16:36:31.103180 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.103166 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" event={"ID":"acf14525-d550-4e69-b0a5-9d0557d765ad","Type":"ContainerStarted","Data":"5c3392916a34349e3ee52015bdcad67a9a3d903c6b6f96d9fcf904eb50d3e27e"} Apr 23 16:36:31.103379 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.103364 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" event={"ID":"acf14525-d550-4e69-b0a5-9d0557d765ad","Type":"ContainerStarted","Data":"578257d8574fcee3dd540edeb56a63bcf4c6633e092bc2094bd8451f6eea1a76"} Apr 23 16:36:31.104801 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.104775 2562 generic.go:358] "Generic 
(PLEG): container finished" podID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerID="fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3" exitCode=0 Apr 23 16:36:31.104909 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.104854 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e86cbe0f-1d12-44e9-8b77-a009364ca559","Type":"ContainerDied","Data":"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3"} Apr 23 16:36:31.105489 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.105473 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" Apr 23 16:36:31.106379 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.106338 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wn6cd" event={"ID":"0ee2ca50-34ff-4830-8c04-92018768a3a7","Type":"ContainerStarted","Data":"6a9a36a8577368b64a53bfecabd20aa260c8117646e1eab40b68e793e5c440be"} Apr 23 16:36:31.106514 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.106497 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-wn6cd" Apr 23 16:36:31.107631 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.107610 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wrchg" event={"ID":"848ea94b-d7c7-4f3c-8f74-d35ab6f9770a","Type":"ContainerStarted","Data":"5e95e00631afedc96400b6d54bfc33643f41d86e13835ec6724e812d21f01363"} Apr 23 16:36:31.107865 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.107850 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wrchg" Apr 23 16:36:31.112882 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.112851 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wrchg" Apr 23 16:36:31.129850 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.129806 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7cbf84d895-rj6tl" podStartSLOduration=2.096140184 podStartE2EDuration="10.12979267s" podCreationTimestamp="2026-04-23 16:36:21 +0000 UTC" firstStartedPulling="2026-04-23 16:36:21.905284818 +0000 UTC m=+61.758099866" lastFinishedPulling="2026-04-23 16:36:29.9389373 +0000 UTC m=+69.791752352" observedRunningTime="2026-04-23 16:36:31.127127474 +0000 UTC m=+70.979942538" watchObservedRunningTime="2026-04-23 16:36:31.12979267 +0000 UTC m=+70.982607734" Apr 23 16:36:31.191931 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.191879 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6bfdbdfb67-mxpcf" podStartSLOduration=3.853158876 podStartE2EDuration="8.191864256s" podCreationTimestamp="2026-04-23 16:36:23 +0000 UTC" firstStartedPulling="2026-04-23 16:36:25.613980495 +0000 UTC m=+65.466795536" lastFinishedPulling="2026-04-23 16:36:29.95268586 +0000 UTC m=+69.805500916" observedRunningTime="2026-04-23 16:36:31.18991861 +0000 UTC m=+71.042733675" watchObservedRunningTime="2026-04-23 16:36:31.191864256 +0000 UTC m=+71.044679298" Apr 23 16:36:31.193455 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.193432 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:36:31.196458 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.196434 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" 
Apr 23 16:36:31.206278 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.206249 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/451de1c0-4375-4c37-8a62-0641aa75255d-original-pull-secret\") pod \"global-pull-secret-syncer-qzdzg\" (UID: \"451de1c0-4375-4c37-8a62-0641aa75255d\") " pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:36:31.213998 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.213955 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-6wbcq" podStartSLOduration=66.954614492 podStartE2EDuration="1m11.213939965s" podCreationTimestamp="2026-04-23 16:35:20 +0000 UTC" firstStartedPulling="2026-04-23 16:36:25.688290479 +0000 UTC m=+65.541105525" lastFinishedPulling="2026-04-23 16:36:29.94761595 +0000 UTC m=+69.800430998" observedRunningTime="2026-04-23 16:36:31.213073611 +0000 UTC m=+71.065888675" watchObservedRunningTime="2026-04-23 16:36:31.213939965 +0000 UTC m=+71.066755029" Apr 23 16:36:31.240192 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.240145 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-wn6cd" podStartSLOduration=67.201418869 podStartE2EDuration="1m11.240128781s" podCreationTimestamp="2026-04-23 16:35:20 +0000 UTC" firstStartedPulling="2026-04-23 16:36:25.928366011 +0000 UTC m=+65.781181053" lastFinishedPulling="2026-04-23 16:36:29.967075911 +0000 UTC m=+69.819890965" observedRunningTime="2026-04-23 16:36:31.239623723 +0000 UTC m=+71.092438804" watchObservedRunningTime="2026-04-23 16:36:31.240128781 +0000 UTC m=+71.092943846" Apr 23 16:36:31.270870 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.270820 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" podStartSLOduration=5.176025556 podStartE2EDuration="9.270805127s" 
podCreationTimestamp="2026-04-23 16:36:22 +0000 UTC" firstStartedPulling="2026-04-23 16:36:25.858695699 +0000 UTC m=+65.711510742" lastFinishedPulling="2026-04-23 16:36:29.953475266 +0000 UTC m=+69.806290313" observedRunningTime="2026-04-23 16:36:31.268733518 +0000 UTC m=+71.121548581" watchObservedRunningTime="2026-04-23 16:36:31.270805127 +0000 UTC m=+71.123620191" Apr 23 16:36:31.290944 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.290895 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-wrchg" podStartSLOduration=3.9540747769999998 podStartE2EDuration="8.290880823s" podCreationTimestamp="2026-04-23 16:36:23 +0000 UTC" firstStartedPulling="2026-04-23 16:36:25.615858022 +0000 UTC m=+65.468673070" lastFinishedPulling="2026-04-23 16:36:29.952664069 +0000 UTC m=+69.805479116" observedRunningTime="2026-04-23 16:36:31.29012467 +0000 UTC m=+71.142939726" watchObservedRunningTime="2026-04-23 16:36:31.290880823 +0000 UTC m=+71.143695886" Apr 23 16:36:31.463495 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.463452 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qzdzg" Apr 23 16:36:31.623022 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.622816 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qzdzg"] Apr 23 16:36:31.626215 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:36:31.626169 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod451de1c0_4375_4c37_8a62_0641aa75255d.slice/crio-9906c5075f8c74d52e7afdd9f78bfceac383a16d4c48317bd35090d3ac8d93be WatchSource:0}: Error finding container 9906c5075f8c74d52e7afdd9f78bfceac383a16d4c48317bd35090d3ac8d93be: Status 404 returned error can't find the container with id 9906c5075f8c74d52e7afdd9f78bfceac383a16d4c48317bd35090d3ac8d93be Apr 23 16:36:31.648532 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:31.648502 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6cbbbb9d44-bmcf2" Apr 23 16:36:32.112736 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:32.112687 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qzdzg" event={"ID":"451de1c0-4375-4c37-8a62-0641aa75255d","Type":"ContainerStarted","Data":"9906c5075f8c74d52e7afdd9f78bfceac383a16d4c48317bd35090d3ac8d93be"} Apr 23 16:36:36.128770 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:36.128676 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e86cbe0f-1d12-44e9-8b77-a009364ca559","Type":"ContainerStarted","Data":"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42"} Apr 23 16:36:36.128770 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:36.128715 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"e86cbe0f-1d12-44e9-8b77-a009364ca559","Type":"ContainerStarted","Data":"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e"} Apr 23 16:36:36.128770 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:36.128729 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e86cbe0f-1d12-44e9-8b77-a009364ca559","Type":"ContainerStarted","Data":"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393"} Apr 23 16:36:36.128770 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:36.128756 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e86cbe0f-1d12-44e9-8b77-a009364ca559","Type":"ContainerStarted","Data":"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1"} Apr 23 16:36:36.128770 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:36.128767 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e86cbe0f-1d12-44e9-8b77-a009364ca559","Type":"ContainerStarted","Data":"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d"} Apr 23 16:36:36.128770 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:36.128776 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e86cbe0f-1d12-44e9-8b77-a009364ca559","Type":"ContainerStarted","Data":"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b"} Apr 23 16:36:36.130173 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:36.130147 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qzdzg" event={"ID":"451de1c0-4375-4c37-8a62-0641aa75255d","Type":"ContainerStarted","Data":"c155d36a91d70615f624889d7f431a34bcec3a923c9b836e121c9ef38889b523"} Apr 23 16:36:36.166885 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:36.166834 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.566764981 podStartE2EDuration="12.166815881s" podCreationTimestamp="2026-04-23 16:36:24 +0000 UTC" firstStartedPulling="2026-04-23 16:36:25.858326145 +0000 UTC m=+65.711141204" lastFinishedPulling="2026-04-23 16:36:35.458377059 +0000 UTC m=+75.311192104" observedRunningTime="2026-04-23 16:36:36.163797719 +0000 UTC m=+76.016612788" watchObservedRunningTime="2026-04-23 16:36:36.166815881 +0000 UTC m=+76.019630948" Apr 23 16:36:36.181852 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:36.181804 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qzdzg" podStartSLOduration=65.306195976 podStartE2EDuration="1m9.181789441s" podCreationTimestamp="2026-04-23 16:35:27 +0000 UTC" firstStartedPulling="2026-04-23 16:36:31.62969286 +0000 UTC m=+71.482507919" lastFinishedPulling="2026-04-23 16:36:35.505286339 +0000 UTC m=+75.358101384" observedRunningTime="2026-04-23 16:36:36.180664936 +0000 UTC m=+76.033480001" watchObservedRunningTime="2026-04-23 16:36:36.181789441 +0000 UTC m=+76.034604505" Apr 23 16:36:39.998761 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:39.998638 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:43.226240 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:43.226202 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:43.226240 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:43.226246 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj" Apr 23 16:36:54.935329 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:54.935263 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6cbbbb9d44-bmcf2" 
podUID="d1b9341c-ba96-4e86-80f2-5260ed07f0a1" containerName="console" containerID="cri-o://2dc1118604faa682a1571b866c0bc4a0a1333cd0f6326f076bdc73c31aecaf17" gracePeriod=15
Apr 23 16:36:55.187244 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.187185 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cbbbb9d44-bmcf2_d1b9341c-ba96-4e86-80f2-5260ed07f0a1/console/0.log"
Apr 23 16:36:55.187244 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.187230 2562 generic.go:358] "Generic (PLEG): container finished" podID="d1b9341c-ba96-4e86-80f2-5260ed07f0a1" containerID="2dc1118604faa682a1571b866c0bc4a0a1333cd0f6326f076bdc73c31aecaf17" exitCode=2
Apr 23 16:36:55.187412 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.187273 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cbbbb9d44-bmcf2" event={"ID":"d1b9341c-ba96-4e86-80f2-5260ed07f0a1","Type":"ContainerDied","Data":"2dc1118604faa682a1571b866c0bc4a0a1333cd0f6326f076bdc73c31aecaf17"}
Apr 23 16:36:55.187412 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.187317 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6cbbbb9d44-bmcf2" event={"ID":"d1b9341c-ba96-4e86-80f2-5260ed07f0a1","Type":"ContainerDied","Data":"66059a173649183a80e48974ce0abc52b3f39b7b5cdbf7519ee7eee4ed48e10b"}
Apr 23 16:36:55.187412 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.187332 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66059a173649183a80e48974ce0abc52b3f39b7b5cdbf7519ee7eee4ed48e10b"
Apr 23 16:36:55.187736 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.187724 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6cbbbb9d44-bmcf2_d1b9341c-ba96-4e86-80f2-5260ed07f0a1/console/0.log"
Apr 23 16:36:55.187817 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.187807 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:55.204894 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.204865 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-serving-cert\") pod \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") "
Apr 23 16:36:55.205055 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.204901 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-config\") pod \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") "
Apr 23 16:36:55.205055 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.204947 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-trusted-ca-bundle\") pod \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") "
Apr 23 16:36:55.205055 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.204993 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-oauth-config\") pod \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") "
Apr 23 16:36:55.205055 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.205024 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d56vj\" (UniqueName: \"kubernetes.io/projected/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-kube-api-access-d56vj\") pod \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") "
Apr 23 16:36:55.205055 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.205047 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-oauth-serving-cert\") pod \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") "
Apr 23 16:36:55.205279 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.205118 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-service-ca\") pod \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\" (UID: \"d1b9341c-ba96-4e86-80f2-5260ed07f0a1\") "
Apr 23 16:36:55.205830 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.205774 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d1b9341c-ba96-4e86-80f2-5260ed07f0a1" (UID: "d1b9341c-ba96-4e86-80f2-5260ed07f0a1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:36:55.205959 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.205817 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-config" (OuterVolumeSpecName: "console-config") pod "d1b9341c-ba96-4e86-80f2-5260ed07f0a1" (UID: "d1b9341c-ba96-4e86-80f2-5260ed07f0a1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:36:55.206288 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.206135 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-service-ca" (OuterVolumeSpecName: "service-ca") pod "d1b9341c-ba96-4e86-80f2-5260ed07f0a1" (UID: "d1b9341c-ba96-4e86-80f2-5260ed07f0a1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:36:55.206373 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.206355 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d1b9341c-ba96-4e86-80f2-5260ed07f0a1" (UID: "d1b9341c-ba96-4e86-80f2-5260ed07f0a1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:36:55.215345 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.215289 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d1b9341c-ba96-4e86-80f2-5260ed07f0a1" (UID: "d1b9341c-ba96-4e86-80f2-5260ed07f0a1"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:36:55.215554 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.215391 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d1b9341c-ba96-4e86-80f2-5260ed07f0a1" (UID: "d1b9341c-ba96-4e86-80f2-5260ed07f0a1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:36:55.216229 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.216164 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-kube-api-access-d56vj" (OuterVolumeSpecName: "kube-api-access-d56vj") pod "d1b9341c-ba96-4e86-80f2-5260ed07f0a1" (UID: "d1b9341c-ba96-4e86-80f2-5260ed07f0a1"). InnerVolumeSpecName "kube-api-access-d56vj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:36:55.306366 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.306329 2562 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-service-ca\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:36:55.306366 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.306361 2562 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-serving-cert\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:36:55.306366 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.306370 2562 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-config\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:36:55.306596 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.306387 2562 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-trusted-ca-bundle\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:36:55.306596 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.306396 2562 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-console-oauth-config\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:36:55.306596 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.306405 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d56vj\" (UniqueName: \"kubernetes.io/projected/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-kube-api-access-d56vj\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:36:55.306596 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:55.306414 2562 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d1b9341c-ba96-4e86-80f2-5260ed07f0a1-oauth-serving-cert\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:36:56.189785 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:56.189738 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6cbbbb9d44-bmcf2"
Apr 23 16:36:56.211185 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:56.211157 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6cbbbb9d44-bmcf2"]
Apr 23 16:36:56.215025 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:56.215003 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6cbbbb9d44-bmcf2"]
Apr 23 16:36:56.742043 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:36:56.742010 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1b9341c-ba96-4e86-80f2-5260ed07f0a1" path="/var/lib/kubelet/pods/d1b9341c-ba96-4e86-80f2-5260ed07f0a1/volumes"
Apr 23 16:37:02.116058 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:02.116022 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-wn6cd"
Apr 23 16:37:03.231707 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:03.231677 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj"
Apr 23 16:37:03.235813 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:03.235784 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6c9b88dfdd-b9ljj"
Apr 23 16:37:03.709974 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:03.709942 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6c9b88dfdd-b9ljj_3b9aeaea-8c1b-40ec-b888-528be672bf52/metrics-server/0.log"
Apr 23 16:37:03.910261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:03.910221 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-wrchg_848ea94b-d7c7-4f3c-8f74-d35ab6f9770a/monitoring-plugin/0.log"
Apr 23 16:37:04.113531 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:04.113449 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cxgtz_354460b2-9790-4c05-9a2a-dc4bab2fa675/init-textfile/0.log"
Apr 23 16:37:04.308488 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:04.308457 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cxgtz_354460b2-9790-4c05-9a2a-dc4bab2fa675/node-exporter/0.log"
Apr 23 16:37:04.508547 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:04.508513 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cxgtz_354460b2-9790-4c05-9a2a-dc4bab2fa675/kube-rbac-proxy/0.log"
Apr 23 16:37:06.507820 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:06.507794 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e86cbe0f-1d12-44e9-8b77-a009364ca559/init-config-reloader/0.log"
Apr 23 16:37:06.708249 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:06.708217 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e86cbe0f-1d12-44e9-8b77-a009364ca559/prometheus/0.log"
Apr 23 16:37:06.907636 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:06.907600 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e86cbe0f-1d12-44e9-8b77-a009364ca559/config-reloader/0.log"
Apr 23 16:37:07.107927 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:07.107879 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e86cbe0f-1d12-44e9-8b77-a009364ca559/thanos-sidecar/0.log"
Apr 23 16:37:07.309461 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:07.309326 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e86cbe0f-1d12-44e9-8b77-a009364ca559/kube-rbac-proxy-web/0.log"
Apr 23 16:37:07.507582 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:07.507558 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e86cbe0f-1d12-44e9-8b77-a009364ca559/kube-rbac-proxy/0.log"
Apr 23 16:37:07.711437 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:07.711384 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e86cbe0f-1d12-44e9-8b77-a009364ca559/kube-rbac-proxy-thanos/0.log"
Apr 23 16:37:07.908695 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:07.908667 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9qng7_46010ce7-8871-4f06-90d6-4933fe17216c/prometheus-operator/0.log"
Apr 23 16:37:08.108078 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:08.107994 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9qng7_46010ce7-8871-4f06-90d6-4933fe17216c/kube-rbac-proxy/0.log"
Apr 23 16:37:08.508574 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:08.508546 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6bfdbdfb67-mxpcf_acf14525-d550-4e69-b0a5-9d0557d765ad/telemeter-client/0.log"
Apr 23 16:37:08.707503 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:08.707474 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6bfdbdfb67-mxpcf_acf14525-d550-4e69-b0a5-9d0557d765ad/reload/0.log"
Apr 23 16:37:08.907838 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:08.907811 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6bfdbdfb67-mxpcf_acf14525-d550-4e69-b0a5-9d0557d765ad/kube-rbac-proxy/0.log"
Apr 23 16:37:09.109381 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:09.109354 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbf84d895-rj6tl_d2eb8e62-9f07-4d19-9d67-f66771d26287/thanos-query/0.log"
Apr 23 16:37:09.308139 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:09.308069 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbf84d895-rj6tl_d2eb8e62-9f07-4d19-9d67-f66771d26287/kube-rbac-proxy-web/0.log"
Apr 23 16:37:09.507374 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:09.507350 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbf84d895-rj6tl_d2eb8e62-9f07-4d19-9d67-f66771d26287/kube-rbac-proxy/0.log"
Apr 23 16:37:09.707987 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:09.707953 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbf84d895-rj6tl_d2eb8e62-9f07-4d19-9d67-f66771d26287/prom-label-proxy/0.log"
Apr 23 16:37:09.908412 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:09.908371 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbf84d895-rj6tl_d2eb8e62-9f07-4d19-9d67-f66771d26287/kube-rbac-proxy-rules/0.log"
Apr 23 16:37:10.108574 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:10.108500 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbf84d895-rj6tl_d2eb8e62-9f07-4d19-9d67-f66771d26287/kube-rbac-proxy-metrics/0.log"
Apr 23 16:37:12.107926 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:12.107894 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mh25k_aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9/serve-healthcheck-canary/0.log"
Apr 23 16:37:24.998633 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:24.998588 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:37:25.018214 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:25.018186 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:37:25.289563 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:25.289493 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:37:43.032649 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.032609 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 16:37:43.033121 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.033065 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="prometheus" containerID="cri-o://60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b" gracePeriod=600
Apr 23 16:37:43.033121 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.033095 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="kube-rbac-proxy" containerID="cri-o://d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e" gracePeriod=600
Apr 23 16:37:43.033246 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.033136 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="thanos-sidecar" containerID="cri-o://2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1" gracePeriod=600
Apr 23 16:37:43.033246 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.033157 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="kube-rbac-proxy-thanos" containerID="cri-o://e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42" gracePeriod=600
Apr 23 16:37:43.033246 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.033174 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="kube-rbac-proxy-web" containerID="cri-o://9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393" gracePeriod=600
Apr 23 16:37:43.033246 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.033213 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="config-reloader" containerID="cri-o://dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d" gracePeriod=600
Apr 23 16:37:43.288590 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.288272 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:37:43.306488 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306453 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-k8s-rulefiles-0\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.306665 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306504 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-metrics-client-ca\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.306665 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306531 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-trusted-ca-bundle\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.306665 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306561 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-k8s-db\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.306665 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306601 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-config\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.306665 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306624 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-grpc-tls\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.306665 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306646 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-serving-certs-ca-bundle\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.306992 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306698 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.306992 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306726 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e86cbe0f-1d12-44e9-8b77-a009364ca559-tls-assets\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.306992 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306804 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.306992 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306835 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-thanos-prometheus-http-client-file\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.306992 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306866 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-kubelet-serving-ca-bundle\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.306992 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306893 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-metrics-client-certs\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.306992 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306907 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:37:43.306992 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306924 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nc75\" (UniqueName: \"kubernetes.io/projected/e86cbe0f-1d12-44e9-8b77-a009364ca559-kube-api-access-2nc75\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.306992 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306951 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-kube-rbac-proxy\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.306992 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306953 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:37:43.306992 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.306979 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-web-config\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.307493 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.307005 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e86cbe0f-1d12-44e9-8b77-a009364ca559-config-out\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.307493 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.307040 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-tls\") pod \"e86cbe0f-1d12-44e9-8b77-a009364ca559\" (UID: \"e86cbe0f-1d12-44e9-8b77-a009364ca559\") "
Apr 23 16:37:43.307493 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.307222 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:37:43.307493 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.307265 2562 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-metrics-client-ca\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:37:43.307493 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.307284 2562 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-trusted-ca-bundle\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:37:43.307493 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.307413 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:37:43.307967 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.307903 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:37:43.307967 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.307930 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:37:43.310946 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.310701 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:37:43.311242 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.311194 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e86cbe0f-1d12-44e9-8b77-a009364ca559-config-out" (OuterVolumeSpecName: "config-out") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:37:43.311242 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.311205 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:37:43.311616 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.311579 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:37:43.313535 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.313506 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-config" (OuterVolumeSpecName: "config") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:37:43.313734 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.313705 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:37:43.313869 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.313778 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:37:43.313869 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.313840 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:37:43.314045 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.314020 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86cbe0f-1d12-44e9-8b77-a009364ca559-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:37:43.314154 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.314106 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:37:43.314154 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.314145 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e86cbe0f-1d12-44e9-8b77-a009364ca559-kube-api-access-2nc75" (OuterVolumeSpecName: "kube-api-access-2nc75") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "kube-api-access-2nc75". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:37:43.323883 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.323854 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-web-config" (OuterVolumeSpecName: "web-config") pod "e86cbe0f-1d12-44e9-8b77-a009364ca559" (UID: "e86cbe0f-1d12-44e9-8b77-a009364ca559"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:37:43.325687 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.325657 2562 generic.go:358] "Generic (PLEG): container finished" podID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerID="e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42" exitCode=0
Apr 23 16:37:43.325687 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.325687 2562 generic.go:358] "Generic (PLEG): container finished" podID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerID="d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e" exitCode=0
Apr 23 16:37:43.325852 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.325701 2562 generic.go:358] "Generic (PLEG): container finished" podID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerID="9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393" exitCode=0
Apr 23 16:37:43.325852 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.325710 2562 generic.go:358] "Generic (PLEG): container finished" podID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerID="2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1" exitCode=0
Apr 23 16:37:43.325852 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.325720 2562 generic.go:358] "Generic (PLEG): container finished" podID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerID="dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d" exitCode=0
Apr 23 16:37:43.325852 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.325728 2562 generic.go:358] "Generic (PLEG): container finished" podID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerID="60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b" exitCode=0
Apr 23 16:37:43.325852 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.325729 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e86cbe0f-1d12-44e9-8b77-a009364ca559","Type":"ContainerDied","Data":"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42"}
Apr 23 16:37:43.325852 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.325770 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 16:37:43.325852 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.325790 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e86cbe0f-1d12-44e9-8b77-a009364ca559","Type":"ContainerDied","Data":"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e"}
Apr 23 16:37:43.325852 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.325802 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e86cbe0f-1d12-44e9-8b77-a009364ca559","Type":"ContainerDied","Data":"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393"}
Apr 23 16:37:43.325852 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.325813 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e86cbe0f-1d12-44e9-8b77-a009364ca559","Type":"ContainerDied","Data":"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1"}
Apr 23 16:37:43.325852 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.325823 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e86cbe0f-1d12-44e9-8b77-a009364ca559","Type":"ContainerDied","Data":"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d"}
Apr 23 16:37:43.325852 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.325832 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e86cbe0f-1d12-44e9-8b77-a009364ca559","Type":"ContainerDied","Data":"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b"} Apr 23 16:37:43.325852 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.325841 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e86cbe0f-1d12-44e9-8b77-a009364ca559","Type":"ContainerDied","Data":"53894a29fa714ab12508ee5acc3c6d6e457dcca493ea53d8a4d8ce0d8411e563"} Apr 23 16:37:43.325852 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.325853 2562 scope.go:117] "RemoveContainer" containerID="e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42" Apr 23 16:37:43.336272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.335709 2562 scope.go:117] "RemoveContainer" containerID="d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e" Apr 23 16:37:43.349230 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.346245 2562 scope.go:117] "RemoveContainer" containerID="9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393" Apr 23 16:37:43.357276 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.357257 2562 scope.go:117] "RemoveContainer" containerID="2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1" Apr 23 16:37:43.365436 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.365404 2562 scope.go:117] "RemoveContainer" containerID="dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d" Apr 23 16:37:43.366732 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.366711 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:37:43.373730 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.373701 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:37:43.374523 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.374502 2562 scope.go:117] "RemoveContainer" containerID="60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b" Apr 23 16:37:43.383913 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.383892 2562 scope.go:117] "RemoveContainer" containerID="fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3" Apr 23 16:37:43.392639 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.392613 2562 scope.go:117] "RemoveContainer" containerID="e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42" Apr 23 16:37:43.392897 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:37:43.392876 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42\": container with ID starting with e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42 not found: ID does not exist" containerID="e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42" Apr 23 16:37:43.392961 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.392907 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42"} err="failed to get container status \"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42\": rpc error: code = NotFound desc = could not find container \"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42\": container with ID starting with e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42 not found: ID does not exist" Apr 23 16:37:43.392961 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.392941 2562 scope.go:117] "RemoveContainer" containerID="d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e" Apr 23 16:37:43.393255 ip-10-0-135-57 kubenswrapper[2562]: E0423 
16:37:43.393234 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e\": container with ID starting with d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e not found: ID does not exist" containerID="d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e" Apr 23 16:37:43.393351 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.393263 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e"} err="failed to get container status \"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e\": rpc error: code = NotFound desc = could not find container \"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e\": container with ID starting with d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e not found: ID does not exist" Apr 23 16:37:43.393351 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.393291 2562 scope.go:117] "RemoveContainer" containerID="9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393" Apr 23 16:37:43.393568 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:37:43.393548 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393\": container with ID starting with 9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393 not found: ID does not exist" containerID="9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393" Apr 23 16:37:43.393652 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.393578 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393"} err="failed to get container status 
\"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393\": rpc error: code = NotFound desc = could not find container \"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393\": container with ID starting with 9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393 not found: ID does not exist" Apr 23 16:37:43.393652 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.393601 2562 scope.go:117] "RemoveContainer" containerID="2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1" Apr 23 16:37:43.394027 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:37:43.394009 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1\": container with ID starting with 2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1 not found: ID does not exist" containerID="2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1" Apr 23 16:37:43.394101 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.394032 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1"} err="failed to get container status \"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1\": rpc error: code = NotFound desc = could not find container \"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1\": container with ID starting with 2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1 not found: ID does not exist" Apr 23 16:37:43.394101 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.394053 2562 scope.go:117] "RemoveContainer" containerID="dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d" Apr 23 16:37:43.394345 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:37:43.394324 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d\": container with ID starting with dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d not found: ID does not exist" containerID="dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d" Apr 23 16:37:43.394435 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.394374 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d"} err="failed to get container status \"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d\": rpc error: code = NotFound desc = could not find container \"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d\": container with ID starting with dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d not found: ID does not exist" Apr 23 16:37:43.394435 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.394389 2562 scope.go:117] "RemoveContainer" containerID="60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b" Apr 23 16:37:43.394669 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:37:43.394646 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b\": container with ID starting with 60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b not found: ID does not exist" containerID="60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b" Apr 23 16:37:43.394723 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.394677 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b"} err="failed to get container status \"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b\": rpc error: code = NotFound desc = could not find 
container \"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b\": container with ID starting with 60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b not found: ID does not exist" Apr 23 16:37:43.394723 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.394698 2562 scope.go:117] "RemoveContainer" containerID="fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3" Apr 23 16:37:43.394997 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:37:43.394978 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3\": container with ID starting with fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3 not found: ID does not exist" containerID="fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3" Apr 23 16:37:43.395073 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.395004 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3"} err="failed to get container status \"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3\": rpc error: code = NotFound desc = could not find container \"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3\": container with ID starting with fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3 not found: ID does not exist" Apr 23 16:37:43.395073 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.395020 2562 scope.go:117] "RemoveContainer" containerID="e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42" Apr 23 16:37:43.395441 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.395417 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42"} err="failed to get container status 
\"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42\": rpc error: code = NotFound desc = could not find container \"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42\": container with ID starting with e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42 not found: ID does not exist" Apr 23 16:37:43.395482 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.395445 2562 scope.go:117] "RemoveContainer" containerID="d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e" Apr 23 16:37:43.395647 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.395629 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e"} err="failed to get container status \"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e\": rpc error: code = NotFound desc = could not find container \"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e\": container with ID starting with d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e not found: ID does not exist" Apr 23 16:37:43.395702 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.395649 2562 scope.go:117] "RemoveContainer" containerID="9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393" Apr 23 16:37:43.395912 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.395889 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393"} err="failed to get container status \"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393\": rpc error: code = NotFound desc = could not find container \"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393\": container with ID starting with 9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393 not found: ID does not exist" Apr 23 16:37:43.395912 ip-10-0-135-57 
kubenswrapper[2562]: I0423 16:37:43.395911 2562 scope.go:117] "RemoveContainer" containerID="2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1" Apr 23 16:37:43.396171 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.396148 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1"} err="failed to get container status \"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1\": rpc error: code = NotFound desc = could not find container \"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1\": container with ID starting with 2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1 not found: ID does not exist" Apr 23 16:37:43.396220 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.396173 2562 scope.go:117] "RemoveContainer" containerID="dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d" Apr 23 16:37:43.396418 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.396391 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d"} err="failed to get container status \"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d\": rpc error: code = NotFound desc = could not find container \"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d\": container with ID starting with dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d not found: ID does not exist" Apr 23 16:37:43.396546 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.396421 2562 scope.go:117] "RemoveContainer" containerID="60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.396701 2562 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b"} err="failed to get container status \"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b\": rpc error: code = NotFound desc = could not find container \"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b\": container with ID starting with 60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b not found: ID does not exist" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.396727 2562 scope.go:117] "RemoveContainer" containerID="fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.397006 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3"} err="failed to get container status \"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3\": rpc error: code = NotFound desc = could not find container \"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3\": container with ID starting with fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3 not found: ID does not exist" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.397056 2562 scope.go:117] "RemoveContainer" containerID="e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.397256 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42"} err="failed to get container status \"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42\": rpc error: code = NotFound desc = could not find container \"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42\": container with ID starting with 
e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42 not found: ID does not exist" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.397277 2562 scope.go:117] "RemoveContainer" containerID="d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.397465 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e"} err="failed to get container status \"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e\": rpc error: code = NotFound desc = could not find container \"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e\": container with ID starting with d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e not found: ID does not exist" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.397484 2562 scope.go:117] "RemoveContainer" containerID="9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.397669 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393"} err="failed to get container status \"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393\": rpc error: code = NotFound desc = could not find container \"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393\": container with ID starting with 9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393 not found: ID does not exist" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.397688 2562 scope.go:117] "RemoveContainer" containerID="2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.397901 2562 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1"} err="failed to get container status \"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1\": rpc error: code = NotFound desc = could not find container \"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1\": container with ID starting with 2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1 not found: ID does not exist" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.397923 2562 scope.go:117] "RemoveContainer" containerID="dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.398102 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d"} err="failed to get container status \"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d\": rpc error: code = NotFound desc = could not find container \"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d\": container with ID starting with dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d not found: ID does not exist" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.398120 2562 scope.go:117] "RemoveContainer" containerID="60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.398290 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b"} err="failed to get container status \"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b\": rpc error: code = NotFound desc = could not find container 
\"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b\": container with ID starting with 60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b not found: ID does not exist" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.398307 2562 scope.go:117] "RemoveContainer" containerID="fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3" Apr 23 16:37:43.398533 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.398524 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3"} err="failed to get container status \"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3\": rpc error: code = NotFound desc = could not find container \"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3\": container with ID starting with fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3 not found: ID does not exist" Apr 23 16:37:43.399514 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.398549 2562 scope.go:117] "RemoveContainer" containerID="e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42" Apr 23 16:37:43.399514 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.398848 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42"} err="failed to get container status \"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42\": rpc error: code = NotFound desc = could not find container \"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42\": container with ID starting with e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42 not found: ID does not exist" Apr 23 16:37:43.399514 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.398891 2562 scope.go:117] "RemoveContainer" 
containerID="d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e" Apr 23 16:37:43.399514 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.399097 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e"} err="failed to get container status \"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e\": rpc error: code = NotFound desc = could not find container \"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e\": container with ID starting with d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e not found: ID does not exist" Apr 23 16:37:43.399514 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.399117 2562 scope.go:117] "RemoveContainer" containerID="9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393" Apr 23 16:37:43.399514 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.399368 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393"} err="failed to get container status \"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393\": rpc error: code = NotFound desc = could not find container \"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393\": container with ID starting with 9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393 not found: ID does not exist" Apr 23 16:37:43.399514 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.399383 2562 scope.go:117] "RemoveContainer" containerID="2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1" Apr 23 16:37:43.399768 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.399645 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1"} err="failed to get container status 
\"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1\": rpc error: code = NotFound desc = could not find container \"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1\": container with ID starting with 2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1 not found: ID does not exist" Apr 23 16:37:43.399768 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.399664 2562 scope.go:117] "RemoveContainer" containerID="dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d" Apr 23 16:37:43.399942 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.399911 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d"} err="failed to get container status \"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d\": rpc error: code = NotFound desc = could not find container \"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d\": container with ID starting with dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d not found: ID does not exist" Apr 23 16:37:43.400001 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.399943 2562 scope.go:117] "RemoveContainer" containerID="60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b" Apr 23 16:37:43.400159 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.400143 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b"} err="failed to get container status \"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b\": rpc error: code = NotFound desc = could not find container \"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b\": container with ID starting with 60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b not found: ID does not exist" Apr 23 16:37:43.400209 ip-10-0-135-57 
kubenswrapper[2562]: I0423 16:37:43.400162 2562 scope.go:117] "RemoveContainer" containerID="fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3" Apr 23 16:37:43.400356 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.400340 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3"} err="failed to get container status \"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3\": rpc error: code = NotFound desc = could not find container \"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3\": container with ID starting with fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3 not found: ID does not exist" Apr 23 16:37:43.400356 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.400356 2562 scope.go:117] "RemoveContainer" containerID="e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42" Apr 23 16:37:43.400599 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.400583 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42"} err="failed to get container status \"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42\": rpc error: code = NotFound desc = could not find container \"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42\": container with ID starting with e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42 not found: ID does not exist" Apr 23 16:37:43.400648 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.400599 2562 scope.go:117] "RemoveContainer" containerID="d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e" Apr 23 16:37:43.400838 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.400819 2562 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e"} err="failed to get container status \"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e\": rpc error: code = NotFound desc = could not find container \"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e\": container with ID starting with d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e not found: ID does not exist" Apr 23 16:37:43.400918 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.400840 2562 scope.go:117] "RemoveContainer" containerID="9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393" Apr 23 16:37:43.401076 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.401050 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393"} err="failed to get container status \"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393\": rpc error: code = NotFound desc = could not find container \"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393\": container with ID starting with 9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393 not found: ID does not exist" Apr 23 16:37:43.401132 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.401079 2562 scope.go:117] "RemoveContainer" containerID="2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1" Apr 23 16:37:43.401344 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.401326 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1"} err="failed to get container status \"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1\": rpc error: code = NotFound desc = could not find container \"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1\": container with ID starting with 
2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1 not found: ID does not exist" Apr 23 16:37:43.401400 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.401345 2562 scope.go:117] "RemoveContainer" containerID="dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d" Apr 23 16:37:43.401561 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.401543 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d"} err="failed to get container status \"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d\": rpc error: code = NotFound desc = could not find container \"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d\": container with ID starting with dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d not found: ID does not exist" Apr 23 16:37:43.401612 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.401562 2562 scope.go:117] "RemoveContainer" containerID="60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b" Apr 23 16:37:43.401816 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.401798 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b"} err="failed to get container status \"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b\": rpc error: code = NotFound desc = could not find container \"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b\": container with ID starting with 60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b not found: ID does not exist" Apr 23 16:37:43.401869 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.401817 2562 scope.go:117] "RemoveContainer" containerID="fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3" Apr 23 16:37:43.402065 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.402047 2562 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3"} err="failed to get container status \"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3\": rpc error: code = NotFound desc = could not find container \"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3\": container with ID starting with fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3 not found: ID does not exist" Apr 23 16:37:43.402113 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.402068 2562 scope.go:117] "RemoveContainer" containerID="e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42" Apr 23 16:37:43.402309 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.402284 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42"} err="failed to get container status \"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42\": rpc error: code = NotFound desc = could not find container \"e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42\": container with ID starting with e7aa6b764629d6c69f1903ce8ae868fbaabdacc0c13bf3ec539385257202be42 not found: ID does not exist" Apr 23 16:37:43.402367 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.402311 2562 scope.go:117] "RemoveContainer" containerID="d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e" Apr 23 16:37:43.402560 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.402542 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e"} err="failed to get container status \"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e\": rpc error: code = NotFound desc = could not find container 
\"d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e\": container with ID starting with d9913196204ce6713caae5213041d1aba6d71f5a727d2b3588dd9e4db9b75f5e not found: ID does not exist" Apr 23 16:37:43.402616 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.402562 2562 scope.go:117] "RemoveContainer" containerID="9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393" Apr 23 16:37:43.402810 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.402790 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393"} err="failed to get container status \"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393\": rpc error: code = NotFound desc = could not find container \"9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393\": container with ID starting with 9d2067864899f370575c570f30e9f0ce2aed1e5bcff88c221c8c6e62dd1ea393 not found: ID does not exist" Apr 23 16:37:43.402871 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.402811 2562 scope.go:117] "RemoveContainer" containerID="2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1" Apr 23 16:37:43.403030 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.403009 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1"} err="failed to get container status \"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1\": rpc error: code = NotFound desc = could not find container \"2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1\": container with ID starting with 2a87ebf5c434c4390f8b278dd5bfb562bcd5d797b61aa9247251a773e431c8b1 not found: ID does not exist" Apr 23 16:37:43.403073 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.403033 2562 scope.go:117] "RemoveContainer" 
containerID="dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d" Apr 23 16:37:43.403255 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.403237 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d"} err="failed to get container status \"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d\": rpc error: code = NotFound desc = could not find container \"dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d\": container with ID starting with dba06d812a3c4d2dd3912a3ebe958762b30e870180ed03dde4ab1f7787202a6d not found: ID does not exist" Apr 23 16:37:43.403309 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.403257 2562 scope.go:117] "RemoveContainer" containerID="60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b" Apr 23 16:37:43.403595 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.403559 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b"} err="failed to get container status \"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b\": rpc error: code = NotFound desc = could not find container \"60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b\": container with ID starting with 60242ccdc09d321ac02426ef69b0c8d1270c3397c772c238ecc7cefe5b36957b not found: ID does not exist" Apr 23 16:37:43.403595 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.403593 2562 scope.go:117] "RemoveContainer" containerID="fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3" Apr 23 16:37:43.403832 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.403811 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3"} err="failed to get container status 
\"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3\": rpc error: code = NotFound desc = could not find container \"fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3\": container with ID starting with fcb955a8997e18ad73021bed8baafba609e9448e5312b6614c345bbdd1c1cfe3 not found: ID does not exist" Apr 23 16:37:43.409807 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.408298 2562 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-k8s-db\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:37:43.409807 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.408327 2562 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-config\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:37:43.409807 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.408342 2562 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-grpc-tls\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:37:43.409807 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.408356 2562 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:37:43.409807 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.408371 2562 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:37:43.409807 ip-10-0-135-57 
kubenswrapper[2562]: I0423 16:37:43.408385 2562 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e86cbe0f-1d12-44e9-8b77-a009364ca559-tls-assets\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:37:43.409807 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.408398 2562 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:37:43.409807 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.408413 2562 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-thanos-prometheus-http-client-file\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:37:43.409807 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.408426 2562 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:37:43.409807 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.408440 2562 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-metrics-client-certs\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:37:43.409807 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.408454 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2nc75\" (UniqueName: \"kubernetes.io/projected/e86cbe0f-1d12-44e9-8b77-a009364ca559-kube-api-access-2nc75\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 
16:37:43.409807 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.408466 2562 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-kube-rbac-proxy\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:37:43.409807 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.408479 2562 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-web-config\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:37:43.409807 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.408494 2562 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e86cbe0f-1d12-44e9-8b77-a009364ca559-config-out\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:37:43.409807 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.408505 2562 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e86cbe0f-1d12-44e9-8b77-a009364ca559-secret-prometheus-k8s-tls\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:37:43.409807 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.408517 2562 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e86cbe0f-1d12-44e9-8b77-a009364ca559-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:37:43.426342 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426309 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:37:43.426635 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426622 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="prometheus" Apr 23 
16:37:43.426683 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426637 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="prometheus" Apr 23 16:37:43.426683 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426648 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="thanos-sidecar" Apr 23 16:37:43.426683 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426653 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="thanos-sidecar" Apr 23 16:37:43.426683 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426660 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="kube-rbac-proxy-web" Apr 23 16:37:43.426683 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426665 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="kube-rbac-proxy-web" Apr 23 16:37:43.426683 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426677 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="kube-rbac-proxy" Apr 23 16:37:43.426683 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426682 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="kube-rbac-proxy" Apr 23 16:37:43.426903 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426690 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1b9341c-ba96-4e86-80f2-5260ed07f0a1" containerName="console" Apr 23 16:37:43.426903 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426695 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1b9341c-ba96-4e86-80f2-5260ed07f0a1" containerName="console" Apr 23 16:37:43.426903 
ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426701 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="init-config-reloader" Apr 23 16:37:43.426903 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426706 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="init-config-reloader" Apr 23 16:37:43.426903 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426712 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="config-reloader" Apr 23 16:37:43.426903 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426718 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="config-reloader" Apr 23 16:37:43.426903 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426727 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="kube-rbac-proxy-thanos" Apr 23 16:37:43.426903 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426733 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="kube-rbac-proxy-thanos" Apr 23 16:37:43.426903 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426804 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="thanos-sidecar" Apr 23 16:37:43.426903 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426813 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1b9341c-ba96-4e86-80f2-5260ed07f0a1" containerName="console" Apr 23 16:37:43.426903 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426819 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="config-reloader" Apr 
23 16:37:43.426903 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426825 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="prometheus" Apr 23 16:37:43.426903 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426832 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="kube-rbac-proxy-web" Apr 23 16:37:43.426903 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426838 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="kube-rbac-proxy" Apr 23 16:37:43.426903 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.426843 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" containerName="kube-rbac-proxy-thanos" Apr 23 16:37:43.431973 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.431954 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.434938 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.434915 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 16:37:43.436513 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.436495 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 16:37:43.437647 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.437602 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 16:37:43.437989 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.437974 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-4chnbk5ae574f\"" Apr 23 16:37:43.438166 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.438149 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 16:37:43.438257 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.438198 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 16:37:43.438339 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.438321 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 16:37:43.438883 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.438863 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-tdx6x\"" Apr 23 16:37:43.438971 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.438953 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 16:37:43.439499 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.439479 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 16:37:43.440273 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.440254 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 16:37:43.440671 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.440657 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 16:37:43.443560 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.443542 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 16:37:43.447209 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.447186 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 16:37:43.455631 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.455606 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:37:43.509487 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.509451 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.509688 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.509496 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/578680f3-31a4-4c7d-9df3-703f5b279c9a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.509688 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.509520 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/578680f3-31a4-4c7d-9df3-703f5b279c9a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.509688 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.509547 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.509688 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.509607 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/578680f3-31a4-4c7d-9df3-703f5b279c9a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.509688 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.509638 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-config\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.509688 ip-10-0-135-57 
kubenswrapper[2562]: I0423 16:37:43.509669 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578680f3-31a4-4c7d-9df3-703f5b279c9a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.510016 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.509691 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/578680f3-31a4-4c7d-9df3-703f5b279c9a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.510016 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.509723 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578680f3-31a4-4c7d-9df3-703f5b279c9a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.510016 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.509780 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.510016 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.509805 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-kube-rbac-proxy\") pod 
\"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.510016 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.509827 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/578680f3-31a4-4c7d-9df3-703f5b279c9a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.510016 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.509856 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.510016 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.509903 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.510016 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.509952 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-web-config\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.510016 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.509979 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-958hs\" (UniqueName: \"kubernetes.io/projected/578680f3-31a4-4c7d-9df3-703f5b279c9a-kube-api-access-958hs\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.510016 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.510005 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/578680f3-31a4-4c7d-9df3-703f5b279c9a-config-out\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.510445 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.510036 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.611118 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611020 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/578680f3-31a4-4c7d-9df3-703f5b279c9a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.611118 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611069 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-config\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.611118 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611099 2562 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578680f3-31a4-4c7d-9df3-703f5b279c9a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.611380 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611125 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/578680f3-31a4-4c7d-9df3-703f5b279c9a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.611380 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611148 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578680f3-31a4-4c7d-9df3-703f5b279c9a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.611380 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611173 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.611380 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611198 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.611380 
ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611223 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/578680f3-31a4-4c7d-9df3-703f5b279c9a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.611380 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611255 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.611380 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611278 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.611380 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611303 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-web-config\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.611848 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611824 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/578680f3-31a4-4c7d-9df3-703f5b279c9a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 
16:37:43.611933 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611885 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-958hs\" (UniqueName: \"kubernetes.io/projected/578680f3-31a4-4c7d-9df3-703f5b279c9a-kube-api-access-958hs\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.611933 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611899 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578680f3-31a4-4c7d-9df3-703f5b279c9a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.611933 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611924 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/578680f3-31a4-4c7d-9df3-703f5b279c9a-config-out\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.612077 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611980 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.612077 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.612027 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" 
(UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.612077 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.612073 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578680f3-31a4-4c7d-9df3-703f5b279c9a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.612224 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.612118 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/578680f3-31a4-4c7d-9df3-703f5b279c9a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.612224 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.612122 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578680f3-31a4-4c7d-9df3-703f5b279c9a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.612224 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.612166 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.613442 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.611899 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" 
(UniqueName: \"kubernetes.io/configmap/578680f3-31a4-4c7d-9df3-703f5b279c9a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.614227 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.614112 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/578680f3-31a4-4c7d-9df3-703f5b279c9a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.614364 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.614272 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-config\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.615012 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.614692 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-web-config\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.615012 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.614783 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.615012 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.614998 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/578680f3-31a4-4c7d-9df3-703f5b279c9a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.615261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.615238 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.615457 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.615438 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/578680f3-31a4-4c7d-9df3-703f5b279c9a-config-out\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.615972 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.615952 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.616238 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.616219 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/578680f3-31a4-4c7d-9df3-703f5b279c9a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.616281 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.616225 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.616832 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.616809 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.617071 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.617058 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.617263 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.617248 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/578680f3-31a4-4c7d-9df3-703f5b279c9a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.627403 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.627380 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-958hs\" (UniqueName: \"kubernetes.io/projected/578680f3-31a4-4c7d-9df3-703f5b279c9a-kube-api-access-958hs\") pod \"prometheus-k8s-0\" (UID: \"578680f3-31a4-4c7d-9df3-703f5b279c9a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.742082 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.742037 2562 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:43.870598 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:43.870571 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:37:43.873194 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:37:43.873171 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod578680f3_31a4_4c7d_9df3_703f5b279c9a.slice/crio-b2e611bfd24e7ffb3553ec320131c3ae43fa42fcae94d032059047e996acd0df WatchSource:0}: Error finding container b2e611bfd24e7ffb3553ec320131c3ae43fa42fcae94d032059047e996acd0df: Status 404 returned error can't find the container with id b2e611bfd24e7ffb3553ec320131c3ae43fa42fcae94d032059047e996acd0df Apr 23 16:37:44.331452 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:44.331413 2562 generic.go:358] "Generic (PLEG): container finished" podID="578680f3-31a4-4c7d-9df3-703f5b279c9a" containerID="53429ddf60263e4ae7eb4099a4c72f99565abe00c245651eaff1866d88e98a4d" exitCode=0 Apr 23 16:37:44.331452 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:44.331454 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"578680f3-31a4-4c7d-9df3-703f5b279c9a","Type":"ContainerDied","Data":"53429ddf60263e4ae7eb4099a4c72f99565abe00c245651eaff1866d88e98a4d"} Apr 23 16:37:44.331882 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:44.331474 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"578680f3-31a4-4c7d-9df3-703f5b279c9a","Type":"ContainerStarted","Data":"b2e611bfd24e7ffb3553ec320131c3ae43fa42fcae94d032059047e996acd0df"} Apr 23 16:37:44.742117 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:44.742085 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e86cbe0f-1d12-44e9-8b77-a009364ca559" 
path="/var/lib/kubelet/pods/e86cbe0f-1d12-44e9-8b77-a009364ca559/volumes" Apr 23 16:37:45.337594 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:45.337561 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"578680f3-31a4-4c7d-9df3-703f5b279c9a","Type":"ContainerStarted","Data":"aef3eac3e623393bd63d94db8502286f21ddbdd3851afc0c99c566ad7854f4bd"} Apr 23 16:37:45.337594 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:45.337598 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"578680f3-31a4-4c7d-9df3-703f5b279c9a","Type":"ContainerStarted","Data":"c4a4f4d84e443df2342858b73eb301933005bcb1632a6ade6de09e2c19d32400"} Apr 23 16:37:45.338012 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:45.337608 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"578680f3-31a4-4c7d-9df3-703f5b279c9a","Type":"ContainerStarted","Data":"277d5e7a5cb6361ebe8d1e9da313a5b572e0c5b58381da72e110a4db0340833e"} Apr 23 16:37:45.338012 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:45.337617 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"578680f3-31a4-4c7d-9df3-703f5b279c9a","Type":"ContainerStarted","Data":"7c0b1c038efcbf1a2cd97856efef643eff0621375d9eb562c1deca6412cda21d"} Apr 23 16:37:45.338012 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:45.337625 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"578680f3-31a4-4c7d-9df3-703f5b279c9a","Type":"ContainerStarted","Data":"1c2f56192b79e48dba3671cf780b19a1c2e24e52fb56aabd2eb1b6941fa8b5e1"} Apr 23 16:37:45.338012 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:45.337633 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"578680f3-31a4-4c7d-9df3-703f5b279c9a","Type":"ContainerStarted","Data":"7e5b424cf63bc016d00d2c6851ecb934197114f8abc23ef3faac22fa017f6576"} Apr 23 16:37:45.372350 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:45.372299 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.372285874 podStartE2EDuration="2.372285874s" podCreationTimestamp="2026-04-23 16:37:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:37:45.371008035 +0000 UTC m=+145.223823113" watchObservedRunningTime="2026-04-23 16:37:45.372285874 +0000 UTC m=+145.225100938" Apr 23 16:37:48.742520 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:37:48.742489 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:43.742954 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:38:43.742920 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:43.758193 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:38:43.758166 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:38:44.519470 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:38:44.519442 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:40:20.633655 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:40:20.633624 2562 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 16:41:34.914302 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:34.914267 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6799bffdd6-d45jf"] Apr 23 16:41:34.917463 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:34.917446 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:34.920265 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:34.920232 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 16:41:34.920265 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:34.920260 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-g68ff\"" Apr 23 16:41:34.920512 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:34.920235 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 16:41:34.921619 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:34.921600 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 16:41:34.921619 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:34.921602 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 16:41:34.921819 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:34.921623 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 16:41:34.921819 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:34.921652 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 16:41:34.921954 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:34.921880 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 16:41:34.927297 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:34.927280 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 16:41:34.931632 ip-10-0-135-57 kubenswrapper[2562]: 
I0423 16:41:34.931610 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6799bffdd6-d45jf"] Apr 23 16:41:35.098205 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.098166 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-trusted-ca-bundle\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.098205 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.098210 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-console-serving-cert\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.098422 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.098236 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjpzb\" (UniqueName: \"kubernetes.io/projected/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-kube-api-access-sjpzb\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.098422 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.098287 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-console-oauth-config\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.098422 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.098306 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-service-ca\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.098422 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.098346 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-oauth-serving-cert\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.098422 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.098375 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-console-config\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.199471 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.199389 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-oauth-serving-cert\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.199471 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.199427 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-console-config\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 
16:41:35.199471 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.199455 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-trusted-ca-bundle\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.199734 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.199692 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-console-serving-cert\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.199825 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.199738 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjpzb\" (UniqueName: \"kubernetes.io/projected/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-kube-api-access-sjpzb\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.199825 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.199816 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-console-oauth-config\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.199932 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.199838 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-service-ca\") pod \"console-6799bffdd6-d45jf\" (UID: 
\"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.200110 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.200087 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-oauth-serving-cert\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.200208 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.200186 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-console-config\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.200383 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.200360 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-trusted-ca-bundle\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.200484 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.200388 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-service-ca\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf" Apr 23 16:41:35.202163 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.202145 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-console-serving-cert\") pod 
\"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf"
Apr 23 16:41:35.202256 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.202199 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-console-oauth-config\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf"
Apr 23 16:41:35.209208 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.209186 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjpzb\" (UniqueName: \"kubernetes.io/projected/c0299958-18f0-43ec-a2a3-24e19c7d3c8d-kube-api-access-sjpzb\") pod \"console-6799bffdd6-d45jf\" (UID: \"c0299958-18f0-43ec-a2a3-24e19c7d3c8d\") " pod="openshift-console/console-6799bffdd6-d45jf"
Apr 23 16:41:35.227997 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.227976 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6799bffdd6-d45jf"
Apr 23 16:41:35.352131 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.352107 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6799bffdd6-d45jf"]
Apr 23 16:41:35.354151 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:41:35.354124 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0299958_18f0_43ec_a2a3_24e19c7d3c8d.slice/crio-3975bcbe873bfe181075d0dd5a74eb40718650cab79dbaf39676a925466070a4 WatchSource:0}: Error finding container 3975bcbe873bfe181075d0dd5a74eb40718650cab79dbaf39676a925466070a4: Status 404 returned error can't find the container with id 3975bcbe873bfe181075d0dd5a74eb40718650cab79dbaf39676a925466070a4
Apr 23 16:41:35.355867 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.355850 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 16:41:35.973065 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.973032 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6799bffdd6-d45jf" event={"ID":"c0299958-18f0-43ec-a2a3-24e19c7d3c8d","Type":"ContainerStarted","Data":"0623c008e8c4cad26cd24a154f76ebe29c463ffd7e28b0bd77a0253c8dda96dc"}
Apr 23 16:41:35.973065 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.973065 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6799bffdd6-d45jf" event={"ID":"c0299958-18f0-43ec-a2a3-24e19c7d3c8d","Type":"ContainerStarted","Data":"3975bcbe873bfe181075d0dd5a74eb40718650cab79dbaf39676a925466070a4"}
Apr 23 16:41:35.997162 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:35.997112 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6799bffdd6-d45jf" podStartSLOduration=1.99709723 podStartE2EDuration="1.99709723s" podCreationTimestamp="2026-04-23 16:41:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:41:35.995372827 +0000 UTC m=+375.848187892" watchObservedRunningTime="2026-04-23 16:41:35.99709723 +0000 UTC m=+375.849912291"
Apr 23 16:41:45.228945 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:45.228893 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6799bffdd6-d45jf"
Apr 23 16:41:45.228945 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:45.228948 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6799bffdd6-d45jf"
Apr 23 16:41:45.233787 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:45.233761 2562 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6799bffdd6-d45jf"
Apr 23 16:41:46.007411 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:41:46.007382 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6799bffdd6-d45jf"
Apr 23 16:42:12.174801 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:12.174765 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-gjjcv"]
Apr 23 16:42:12.177690 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:12.177675 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-gjjcv"
Apr 23 16:42:12.180414 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:12.180389 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-gjjlh\""
Apr 23 16:42:12.180521 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:12.180393 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 23 16:42:12.181843 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:12.181825 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 23 16:42:12.181925 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:12.181839 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 23 16:42:12.191637 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:12.191613 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-gjjcv"]
Apr 23 16:42:12.307632 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:12.307596 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqjzs\" (UniqueName: \"kubernetes.io/projected/ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433-kube-api-access-qqjzs\") pod \"model-serving-api-86f7b4b499-gjjcv\" (UID: \"ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433\") " pod="kserve/model-serving-api-86f7b4b499-gjjcv"
Apr 23 16:42:12.307819 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:12.307698 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433-tls-certs\") pod \"model-serving-api-86f7b4b499-gjjcv\" (UID: \"ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433\") " pod="kserve/model-serving-api-86f7b4b499-gjjcv"
Apr 23 16:42:12.408630 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:12.408595 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqjzs\" (UniqueName: \"kubernetes.io/projected/ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433-kube-api-access-qqjzs\") pod \"model-serving-api-86f7b4b499-gjjcv\" (UID: \"ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433\") " pod="kserve/model-serving-api-86f7b4b499-gjjcv"
Apr 23 16:42:12.408764 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:12.408658 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433-tls-certs\") pod \"model-serving-api-86f7b4b499-gjjcv\" (UID: \"ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433\") " pod="kserve/model-serving-api-86f7b4b499-gjjcv"
Apr 23 16:42:12.408810 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:42:12.408763 2562 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found
Apr 23 16:42:12.408847 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:42:12.408816 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433-tls-certs podName:ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433 nodeName:}" failed. No retries permitted until 2026-04-23 16:42:12.908800636 +0000 UTC m=+412.761615678 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433-tls-certs") pod "model-serving-api-86f7b4b499-gjjcv" (UID: "ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433") : secret "model-serving-api-tls" not found
Apr 23 16:42:12.420875 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:12.420819 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqjzs\" (UniqueName: \"kubernetes.io/projected/ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433-kube-api-access-qqjzs\") pod \"model-serving-api-86f7b4b499-gjjcv\" (UID: \"ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433\") " pod="kserve/model-serving-api-86f7b4b499-gjjcv"
Apr 23 16:42:12.913252 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:12.913212 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433-tls-certs\") pod \"model-serving-api-86f7b4b499-gjjcv\" (UID: \"ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433\") " pod="kserve/model-serving-api-86f7b4b499-gjjcv"
Apr 23 16:42:12.915624 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:12.915596 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433-tls-certs\") pod \"model-serving-api-86f7b4b499-gjjcv\" (UID: \"ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433\") " pod="kserve/model-serving-api-86f7b4b499-gjjcv"
Apr 23 16:42:13.088107 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:13.088074 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-gjjcv"
Apr 23 16:42:13.203691 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:13.203666 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-gjjcv"]
Apr 23 16:42:13.206057 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:42:13.206028 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceaa651d_2fe1_4eca_9d70_0f0ecb8e5433.slice/crio-e98ff877252120a8597ca3c01b94817fd3e3679bbb2602ccc03deb096375e8ad WatchSource:0}: Error finding container e98ff877252120a8597ca3c01b94817fd3e3679bbb2602ccc03deb096375e8ad: Status 404 returned error can't find the container with id e98ff877252120a8597ca3c01b94817fd3e3679bbb2602ccc03deb096375e8ad
Apr 23 16:42:14.083489 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:14.083448 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-gjjcv" event={"ID":"ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433","Type":"ContainerStarted","Data":"e98ff877252120a8597ca3c01b94817fd3e3679bbb2602ccc03deb096375e8ad"}
Apr 23 16:42:16.090764 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:16.090714 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-gjjcv" event={"ID":"ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433","Type":"ContainerStarted","Data":"050260f38b1f0437b35073b9e4b3b9b27e2a3dce693fb1541c5af41b6ea838d1"}
Apr 23 16:42:16.091176 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:16.090813 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-gjjcv"
Apr 23 16:42:16.113245 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:16.113187 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-gjjcv" podStartSLOduration=1.7378768789999999 podStartE2EDuration="4.113169202s" podCreationTimestamp="2026-04-23 16:42:12 +0000 UTC" firstStartedPulling="2026-04-23 16:42:13.207786208 +0000 UTC m=+413.060601250" lastFinishedPulling="2026-04-23 16:42:15.583078526 +0000 UTC m=+415.435893573" observedRunningTime="2026-04-23 16:42:16.11218657 +0000 UTC m=+415.965001637" watchObservedRunningTime="2026-04-23 16:42:16.113169202 +0000 UTC m=+415.965984267"
Apr 23 16:42:27.097277 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:27.097249 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-gjjcv"
Apr 23 16:42:28.027257 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:28.027226 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-qpp47"]
Apr 23 16:42:28.030558 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:28.030542 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-qpp47"
Apr 23 16:42:28.033467 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:28.033442 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-5mftf\""
Apr 23 16:42:28.033587 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:28.033496 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 23 16:42:28.038204 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:28.038182 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-qpp47"]
Apr 23 16:42:28.145699 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:28.145656 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5mqx\" (UniqueName: \"kubernetes.io/projected/316b0480-e2ec-4d32-b14d-6bd0f8560aba-kube-api-access-r5mqx\") pod \"s3-init-qpp47\" (UID: \"316b0480-e2ec-4d32-b14d-6bd0f8560aba\") " pod="kserve/s3-init-qpp47"
Apr 23 16:42:28.246947 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:28.246908 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5mqx\" (UniqueName: \"kubernetes.io/projected/316b0480-e2ec-4d32-b14d-6bd0f8560aba-kube-api-access-r5mqx\") pod \"s3-init-qpp47\" (UID: \"316b0480-e2ec-4d32-b14d-6bd0f8560aba\") " pod="kserve/s3-init-qpp47"
Apr 23 16:42:28.257311 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:28.257278 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5mqx\" (UniqueName: \"kubernetes.io/projected/316b0480-e2ec-4d32-b14d-6bd0f8560aba-kube-api-access-r5mqx\") pod \"s3-init-qpp47\" (UID: \"316b0480-e2ec-4d32-b14d-6bd0f8560aba\") " pod="kserve/s3-init-qpp47"
Apr 23 16:42:28.349319 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:28.349236 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-qpp47"
Apr 23 16:42:28.467962 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:28.467939 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-qpp47"]
Apr 23 16:42:28.470143 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:42:28.470117 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod316b0480_e2ec_4d32_b14d_6bd0f8560aba.slice/crio-1087b420d13b76ee69d47020ee8650b0010f6c2104237b413f871e78ffb0456c WatchSource:0}: Error finding container 1087b420d13b76ee69d47020ee8650b0010f6c2104237b413f871e78ffb0456c: Status 404 returned error can't find the container with id 1087b420d13b76ee69d47020ee8650b0010f6c2104237b413f871e78ffb0456c
Apr 23 16:42:29.129410 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:29.129370 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-qpp47" event={"ID":"316b0480-e2ec-4d32-b14d-6bd0f8560aba","Type":"ContainerStarted","Data":"1087b420d13b76ee69d47020ee8650b0010f6c2104237b413f871e78ffb0456c"}
Apr 23 16:42:33.142321 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:33.142287 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-qpp47" event={"ID":"316b0480-e2ec-4d32-b14d-6bd0f8560aba","Type":"ContainerStarted","Data":"732e644d4a51743d1fe7318b91781285521e5bc40a56897d4f0c15761f937bdc"}
Apr 23 16:42:33.159605 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:33.159505 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-qpp47" podStartSLOduration=0.724209705 podStartE2EDuration="5.159486854s" podCreationTimestamp="2026-04-23 16:42:28 +0000 UTC" firstStartedPulling="2026-04-23 16:42:28.471939072 +0000 UTC m=+428.324754114" lastFinishedPulling="2026-04-23 16:42:32.907216217 +0000 UTC m=+432.760031263" observedRunningTime="2026-04-23 16:42:33.159155116 +0000 UTC m=+433.011970178" watchObservedRunningTime="2026-04-23 16:42:33.159486854 +0000 UTC m=+433.012301919"
Apr 23 16:42:36.154045 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:36.154008 2562 generic.go:358] "Generic (PLEG): container finished" podID="316b0480-e2ec-4d32-b14d-6bd0f8560aba" containerID="732e644d4a51743d1fe7318b91781285521e5bc40a56897d4f0c15761f937bdc" exitCode=0
Apr 23 16:42:36.154425 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:36.154086 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-qpp47" event={"ID":"316b0480-e2ec-4d32-b14d-6bd0f8560aba","Type":"ContainerDied","Data":"732e644d4a51743d1fe7318b91781285521e5bc40a56897d4f0c15761f937bdc"}
Apr 23 16:42:37.285418 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:37.285395 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-qpp47"
Apr 23 16:42:37.321482 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:37.321443 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5mqx\" (UniqueName: \"kubernetes.io/projected/316b0480-e2ec-4d32-b14d-6bd0f8560aba-kube-api-access-r5mqx\") pod \"316b0480-e2ec-4d32-b14d-6bd0f8560aba\" (UID: \"316b0480-e2ec-4d32-b14d-6bd0f8560aba\") "
Apr 23 16:42:37.323684 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:37.323646 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316b0480-e2ec-4d32-b14d-6bd0f8560aba-kube-api-access-r5mqx" (OuterVolumeSpecName: "kube-api-access-r5mqx") pod "316b0480-e2ec-4d32-b14d-6bd0f8560aba" (UID: "316b0480-e2ec-4d32-b14d-6bd0f8560aba"). InnerVolumeSpecName "kube-api-access-r5mqx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:42:37.423011 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:37.422920 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r5mqx\" (UniqueName: \"kubernetes.io/projected/316b0480-e2ec-4d32-b14d-6bd0f8560aba-kube-api-access-r5mqx\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:42:38.160209 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:38.160165 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-qpp47" event={"ID":"316b0480-e2ec-4d32-b14d-6bd0f8560aba","Type":"ContainerDied","Data":"1087b420d13b76ee69d47020ee8650b0010f6c2104237b413f871e78ffb0456c"}
Apr 23 16:42:38.160209 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:38.160201 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1087b420d13b76ee69d47020ee8650b0010f6c2104237b413f871e78ffb0456c"
Apr 23 16:42:38.160209 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:38.160207 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-qpp47"
Apr 23 16:42:38.902109 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:38.902078 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"]
Apr 23 16:42:38.902485 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:38.902436 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="316b0480-e2ec-4d32-b14d-6bd0f8560aba" containerName="s3-init"
Apr 23 16:42:38.902485 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:38.902447 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="316b0480-e2ec-4d32-b14d-6bd0f8560aba" containerName="s3-init"
Apr 23 16:42:38.902555 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:38.902500 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="316b0480-e2ec-4d32-b14d-6bd0f8560aba" containerName="s3-init"
Apr 23 16:42:38.904376 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:38.904361 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"
Apr 23 16:42:38.907592 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:38.907573 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 23 16:42:38.907712 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:38.907596 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-5mftf\""
Apr 23 16:42:38.913789 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:38.913766 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"]
Apr 23 16:42:38.937597 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:38.937569 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8dbf9b46-3798-49e8-a2f8-477e082d668b-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-ldsbf\" (UID: \"8dbf9b46-3798-49e8-a2f8-477e082d668b\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"
Apr 23 16:42:38.937773 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:38.937624 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxkf7\" (UniqueName: \"kubernetes.io/projected/8dbf9b46-3798-49e8-a2f8-477e082d668b-kube-api-access-nxkf7\") pod \"seaweedfs-tls-custom-ddd4dbfd-ldsbf\" (UID: \"8dbf9b46-3798-49e8-a2f8-477e082d668b\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"
Apr 23 16:42:39.038636 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:39.038597 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxkf7\" (UniqueName: \"kubernetes.io/projected/8dbf9b46-3798-49e8-a2f8-477e082d668b-kube-api-access-nxkf7\") pod \"seaweedfs-tls-custom-ddd4dbfd-ldsbf\" (UID: \"8dbf9b46-3798-49e8-a2f8-477e082d668b\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"
Apr 23 16:42:39.038832 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:39.038695 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8dbf9b46-3798-49e8-a2f8-477e082d668b-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-ldsbf\" (UID: \"8dbf9b46-3798-49e8-a2f8-477e082d668b\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"
Apr 23 16:42:39.039127 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:39.039107 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8dbf9b46-3798-49e8-a2f8-477e082d668b-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-ldsbf\" (UID: \"8dbf9b46-3798-49e8-a2f8-477e082d668b\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"
Apr 23 16:42:39.047801 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:39.047765 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxkf7\" (UniqueName: \"kubernetes.io/projected/8dbf9b46-3798-49e8-a2f8-477e082d668b-kube-api-access-nxkf7\") pod \"seaweedfs-tls-custom-ddd4dbfd-ldsbf\" (UID: \"8dbf9b46-3798-49e8-a2f8-477e082d668b\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"
Apr 23 16:42:39.214570 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:39.214487 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"
Apr 23 16:42:39.333832 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:39.333803 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"]
Apr 23 16:42:39.336216 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:42:39.336178 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dbf9b46_3798_49e8_a2f8_477e082d668b.slice/crio-18eba48188acecfc963bd99c13f03735262294a75fb349aeb95ef0d6eb00c882 WatchSource:0}: Error finding container 18eba48188acecfc963bd99c13f03735262294a75fb349aeb95ef0d6eb00c882: Status 404 returned error can't find the container with id 18eba48188acecfc963bd99c13f03735262294a75fb349aeb95ef0d6eb00c882
Apr 23 16:42:40.166956 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:40.166918 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf" event={"ID":"8dbf9b46-3798-49e8-a2f8-477e082d668b","Type":"ContainerStarted","Data":"18eba48188acecfc963bd99c13f03735262294a75fb349aeb95ef0d6eb00c882"}
Apr 23 16:42:42.173851 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:42.173823 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf" event={"ID":"8dbf9b46-3798-49e8-a2f8-477e082d668b","Type":"ContainerStarted","Data":"859e62fa2920ee9c46c189084160740a7140c89c40d1e078b892fcbc7c8008c9"}
Apr 23 16:42:42.191256 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:42.191197 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf" podStartSLOduration=1.4203553580000001 podStartE2EDuration="4.191178227s" podCreationTimestamp="2026-04-23 16:42:38 +0000 UTC" firstStartedPulling="2026-04-23 16:42:39.33743854 +0000 UTC m=+439.190253583" lastFinishedPulling="2026-04-23 16:42:42.108261406 +0000 UTC m=+441.961076452" observedRunningTime="2026-04-23 16:42:42.189783455 +0000 UTC m=+442.042598512" watchObservedRunningTime="2026-04-23 16:42:42.191178227 +0000 UTC m=+442.043993292"
Apr 23 16:42:43.224835 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:43.224800 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"]
Apr 23 16:42:44.179384 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:44.179346 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf" podUID="8dbf9b46-3798-49e8-a2f8-477e082d668b" containerName="seaweedfs-tls-custom" containerID="cri-o://859e62fa2920ee9c46c189084160740a7140c89c40d1e078b892fcbc7c8008c9" gracePeriod=30
Apr 23 16:42:45.408556 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:45.408532 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"
Apr 23 16:42:45.500318 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:45.500232 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxkf7\" (UniqueName: \"kubernetes.io/projected/8dbf9b46-3798-49e8-a2f8-477e082d668b-kube-api-access-nxkf7\") pod \"8dbf9b46-3798-49e8-a2f8-477e082d668b\" (UID: \"8dbf9b46-3798-49e8-a2f8-477e082d668b\") "
Apr 23 16:42:45.500463 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:45.500363 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8dbf9b46-3798-49e8-a2f8-477e082d668b-data\") pod \"8dbf9b46-3798-49e8-a2f8-477e082d668b\" (UID: \"8dbf9b46-3798-49e8-a2f8-477e082d668b\") "
Apr 23 16:42:45.501636 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:45.501609 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dbf9b46-3798-49e8-a2f8-477e082d668b-data" (OuterVolumeSpecName: "data") pod "8dbf9b46-3798-49e8-a2f8-477e082d668b" (UID: "8dbf9b46-3798-49e8-a2f8-477e082d668b"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:42:45.502242 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:45.502213 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dbf9b46-3798-49e8-a2f8-477e082d668b-kube-api-access-nxkf7" (OuterVolumeSpecName: "kube-api-access-nxkf7") pod "8dbf9b46-3798-49e8-a2f8-477e082d668b" (UID: "8dbf9b46-3798-49e8-a2f8-477e082d668b"). InnerVolumeSpecName "kube-api-access-nxkf7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:42:45.601113 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:45.601077 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nxkf7\" (UniqueName: \"kubernetes.io/projected/8dbf9b46-3798-49e8-a2f8-477e082d668b-kube-api-access-nxkf7\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:42:45.601113 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:45.601107 2562 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/8dbf9b46-3798-49e8-a2f8-477e082d668b-data\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:42:46.185921 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:46.185880 2562 generic.go:358] "Generic (PLEG): container finished" podID="8dbf9b46-3798-49e8-a2f8-477e082d668b" containerID="859e62fa2920ee9c46c189084160740a7140c89c40d1e078b892fcbc7c8008c9" exitCode=0
Apr 23 16:42:46.186086 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:46.185940 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"
Apr 23 16:42:46.186086 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:46.185960 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf" event={"ID":"8dbf9b46-3798-49e8-a2f8-477e082d668b","Type":"ContainerDied","Data":"859e62fa2920ee9c46c189084160740a7140c89c40d1e078b892fcbc7c8008c9"}
Apr 23 16:42:46.186086 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:46.185995 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf" event={"ID":"8dbf9b46-3798-49e8-a2f8-477e082d668b","Type":"ContainerDied","Data":"18eba48188acecfc963bd99c13f03735262294a75fb349aeb95ef0d6eb00c882"}
Apr 23 16:42:46.186086 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:46.186012 2562 scope.go:117] "RemoveContainer" containerID="859e62fa2920ee9c46c189084160740a7140c89c40d1e078b892fcbc7c8008c9"
Apr 23 16:42:46.194800 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:46.194785 2562 scope.go:117] "RemoveContainer" containerID="859e62fa2920ee9c46c189084160740a7140c89c40d1e078b892fcbc7c8008c9"
Apr 23 16:42:46.195046 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:42:46.195026 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"859e62fa2920ee9c46c189084160740a7140c89c40d1e078b892fcbc7c8008c9\": container with ID starting with 859e62fa2920ee9c46c189084160740a7140c89c40d1e078b892fcbc7c8008c9 not found: ID does not exist" containerID="859e62fa2920ee9c46c189084160740a7140c89c40d1e078b892fcbc7c8008c9"
Apr 23 16:42:46.195089 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:46.195054 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"859e62fa2920ee9c46c189084160740a7140c89c40d1e078b892fcbc7c8008c9"} err="failed to get container status \"859e62fa2920ee9c46c189084160740a7140c89c40d1e078b892fcbc7c8008c9\": rpc error: code = NotFound desc = could not find container \"859e62fa2920ee9c46c189084160740a7140c89c40d1e078b892fcbc7c8008c9\": container with ID starting with 859e62fa2920ee9c46c189084160740a7140c89c40d1e078b892fcbc7c8008c9 not found: ID does not exist"
Apr 23 16:42:46.216457 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:46.216437 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"]
Apr 23 16:42:46.220195 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:46.220176 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-ldsbf"]
Apr 23 16:42:46.741627 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:46.741589 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dbf9b46-3798-49e8-a2f8-477e082d668b" path="/var/lib/kubelet/pods/8dbf9b46-3798-49e8-a2f8-477e082d668b/volumes"
Apr 23 16:42:47.617953 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:47.617926 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-js6s4"]
Apr 23 16:42:47.618225 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:47.618211 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8dbf9b46-3798-49e8-a2f8-477e082d668b" containerName="seaweedfs-tls-custom"
Apr 23 16:42:47.618271 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:47.618226 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dbf9b46-3798-49e8-a2f8-477e082d668b" containerName="seaweedfs-tls-custom"
Apr 23 16:42:47.618301 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:47.618284 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8dbf9b46-3798-49e8-a2f8-477e082d668b" containerName="seaweedfs-tls-custom"
Apr 23 16:42:47.621116 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:47.621099 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-js6s4"
Apr 23 16:42:47.623768 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:47.623734 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 23 16:42:47.623880 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:47.623863 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-5mftf\""
Apr 23 16:42:47.629357 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:47.629335 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-js6s4"]
Apr 23 16:42:47.720004 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:47.719950 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmcsw\" (UniqueName: \"kubernetes.io/projected/caa3c9cf-12d6-4cea-8453-20bcb03f5c31-kube-api-access-rmcsw\") pod \"s3-tls-init-custom-js6s4\" (UID: \"caa3c9cf-12d6-4cea-8453-20bcb03f5c31\") " pod="kserve/s3-tls-init-custom-js6s4"
Apr 23 16:42:47.820928 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:47.820891 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rmcsw\" (UniqueName: \"kubernetes.io/projected/caa3c9cf-12d6-4cea-8453-20bcb03f5c31-kube-api-access-rmcsw\") pod \"s3-tls-init-custom-js6s4\" (UID: \"caa3c9cf-12d6-4cea-8453-20bcb03f5c31\") " pod="kserve/s3-tls-init-custom-js6s4"
Apr 23 16:42:47.829650 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:47.829625 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmcsw\" (UniqueName: \"kubernetes.io/projected/caa3c9cf-12d6-4cea-8453-20bcb03f5c31-kube-api-access-rmcsw\") pod \"s3-tls-init-custom-js6s4\" (UID: \"caa3c9cf-12d6-4cea-8453-20bcb03f5c31\") " pod="kserve/s3-tls-init-custom-js6s4"
Apr 23 16:42:47.941448 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:47.941413 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-js6s4"
Apr 23 16:42:48.060614 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:48.060574 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-js6s4"]
Apr 23 16:42:48.063669 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:42:48.063641 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcaa3c9cf_12d6_4cea_8453_20bcb03f5c31.slice/crio-a813db2131c848f81563cff6c56bb55dc9de4635f2c3977c18b0221d78676d96 WatchSource:0}: Error finding container a813db2131c848f81563cff6c56bb55dc9de4635f2c3977c18b0221d78676d96: Status 404 returned error can't find the container with id a813db2131c848f81563cff6c56bb55dc9de4635f2c3977c18b0221d78676d96
Apr 23 16:42:48.193952 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:48.193871 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-js6s4" event={"ID":"caa3c9cf-12d6-4cea-8453-20bcb03f5c31","Type":"ContainerStarted","Data":"85b018d3b1d0ebdda41d4163be4f38fcafed4fdcd884b2646525199fcbe79037"}
Apr 23 16:42:48.193952 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:48.193909 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-js6s4" event={"ID":"caa3c9cf-12d6-4cea-8453-20bcb03f5c31","Type":"ContainerStarted","Data":"a813db2131c848f81563cff6c56bb55dc9de4635f2c3977c18b0221d78676d96"}
Apr 23 16:42:48.210886 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:48.210833 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-js6s4" podStartSLOduration=1.21081625 podStartE2EDuration="1.21081625s" podCreationTimestamp="2026-04-23 16:42:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:42:48.209989407 +0000 UTC m=+448.062804472" watchObservedRunningTime="2026-04-23 16:42:48.21081625 +0000 UTC m=+448.063631316"
Apr 23 16:42:54.214673 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:54.214637 2562 generic.go:358] "Generic (PLEG): container finished" podID="caa3c9cf-12d6-4cea-8453-20bcb03f5c31" containerID="85b018d3b1d0ebdda41d4163be4f38fcafed4fdcd884b2646525199fcbe79037" exitCode=0
Apr 23 16:42:54.215074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:54.214700 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-js6s4" event={"ID":"caa3c9cf-12d6-4cea-8453-20bcb03f5c31","Type":"ContainerDied","Data":"85b018d3b1d0ebdda41d4163be4f38fcafed4fdcd884b2646525199fcbe79037"}
Apr 23 16:42:55.338198 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:55.338169 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-js6s4"
Apr 23 16:42:55.487107 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:55.487012 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmcsw\" (UniqueName: \"kubernetes.io/projected/caa3c9cf-12d6-4cea-8453-20bcb03f5c31-kube-api-access-rmcsw\") pod \"caa3c9cf-12d6-4cea-8453-20bcb03f5c31\" (UID: \"caa3c9cf-12d6-4cea-8453-20bcb03f5c31\") "
Apr 23 16:42:55.489063 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:55.489036 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caa3c9cf-12d6-4cea-8453-20bcb03f5c31-kube-api-access-rmcsw" (OuterVolumeSpecName: "kube-api-access-rmcsw") pod "caa3c9cf-12d6-4cea-8453-20bcb03f5c31" (UID: "caa3c9cf-12d6-4cea-8453-20bcb03f5c31"). InnerVolumeSpecName "kube-api-access-rmcsw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:42:55.588426 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:55.588380 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rmcsw\" (UniqueName: \"kubernetes.io/projected/caa3c9cf-12d6-4cea-8453-20bcb03f5c31-kube-api-access-rmcsw\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:42:56.220942 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:56.220918 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-js6s4"
Apr 23 16:42:56.220942 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:56.220927 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-js6s4" event={"ID":"caa3c9cf-12d6-4cea-8453-20bcb03f5c31","Type":"ContainerDied","Data":"a813db2131c848f81563cff6c56bb55dc9de4635f2c3977c18b0221d78676d96"}
Apr 23 16:42:56.221150 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:56.220953 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a813db2131c848f81563cff6c56bb55dc9de4635f2c3977c18b0221d78676d96"
Apr 23 16:42:56.827955 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:56.827911 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5"]
Apr 23 16:42:56.828403 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:56.828332 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="caa3c9cf-12d6-4cea-8453-20bcb03f5c31" containerName="s3-tls-init-custom"
Apr 23 16:42:56.828403 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:56.828350 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa3c9cf-12d6-4cea-8453-20bcb03f5c31" containerName="s3-tls-init-custom"
Apr 23 16:42:56.828515 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:56.828435 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="caa3c9cf-12d6-4cea-8453-20bcb03f5c31"
containerName="s3-tls-init-custom" Apr 23 16:42:56.830823 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:56.830801 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5" Apr 23 16:42:56.833981 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:56.833963 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 23 16:42:56.834114 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:56.834097 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 23 16:42:56.834220 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:56.834201 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-5mftf\"" Apr 23 16:42:56.841592 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:56.841571 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5"] Apr 23 16:42:56.900469 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:56.900435 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjf9r\" (UniqueName: \"kubernetes.io/projected/59e606bd-cff2-4904-bd7b-81a0fa5b15d5-kube-api-access-hjf9r\") pod \"seaweedfs-tls-serving-7fd5766db9-fg2d5\" (UID: \"59e606bd-cff2-4904-bd7b-81a0fa5b15d5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5" Apr 23 16:42:56.900626 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:56.900510 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/59e606bd-cff2-4904-bd7b-81a0fa5b15d5-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-fg2d5\" (UID: \"59e606bd-cff2-4904-bd7b-81a0fa5b15d5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5" Apr 23 16:42:56.900721 ip-10-0-135-57 
kubenswrapper[2562]: I0423 16:42:56.900632 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/59e606bd-cff2-4904-bd7b-81a0fa5b15d5-data\") pod \"seaweedfs-tls-serving-7fd5766db9-fg2d5\" (UID: \"59e606bd-cff2-4904-bd7b-81a0fa5b15d5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5" Apr 23 16:42:57.001526 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:57.001488 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjf9r\" (UniqueName: \"kubernetes.io/projected/59e606bd-cff2-4904-bd7b-81a0fa5b15d5-kube-api-access-hjf9r\") pod \"seaweedfs-tls-serving-7fd5766db9-fg2d5\" (UID: \"59e606bd-cff2-4904-bd7b-81a0fa5b15d5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5" Apr 23 16:42:57.001694 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:57.001548 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/59e606bd-cff2-4904-bd7b-81a0fa5b15d5-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-fg2d5\" (UID: \"59e606bd-cff2-4904-bd7b-81a0fa5b15d5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5" Apr 23 16:42:57.001694 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:57.001588 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/59e606bd-cff2-4904-bd7b-81a0fa5b15d5-data\") pod \"seaweedfs-tls-serving-7fd5766db9-fg2d5\" (UID: \"59e606bd-cff2-4904-bd7b-81a0fa5b15d5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5" Apr 23 16:42:57.001813 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:42:57.001796 2562 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 23 16:42:57.001851 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:42:57.001817 2562 projected.go:194] Error preparing data for projected 
volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5: secret "seaweedfs-tls-serving" not found Apr 23 16:42:57.001887 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:42:57.001881 2562 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/59e606bd-cff2-4904-bd7b-81a0fa5b15d5-seaweedfs-tls-serving podName:59e606bd-cff2-4904-bd7b-81a0fa5b15d5 nodeName:}" failed. No retries permitted until 2026-04-23 16:42:57.501859335 +0000 UTC m=+457.354674398 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/59e606bd-cff2-4904-bd7b-81a0fa5b15d5-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-fg2d5" (UID: "59e606bd-cff2-4904-bd7b-81a0fa5b15d5") : secret "seaweedfs-tls-serving" not found Apr 23 16:42:57.002101 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:57.002083 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/59e606bd-cff2-4904-bd7b-81a0fa5b15d5-data\") pod \"seaweedfs-tls-serving-7fd5766db9-fg2d5\" (UID: \"59e606bd-cff2-4904-bd7b-81a0fa5b15d5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5" Apr 23 16:42:57.012580 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:57.012546 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjf9r\" (UniqueName: \"kubernetes.io/projected/59e606bd-cff2-4904-bd7b-81a0fa5b15d5-kube-api-access-hjf9r\") pod \"seaweedfs-tls-serving-7fd5766db9-fg2d5\" (UID: \"59e606bd-cff2-4904-bd7b-81a0fa5b15d5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5" Apr 23 16:42:57.507677 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:57.507637 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/59e606bd-cff2-4904-bd7b-81a0fa5b15d5-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-fg2d5\" (UID: 
\"59e606bd-cff2-4904-bd7b-81a0fa5b15d5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5" Apr 23 16:42:57.509997 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:57.509977 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/59e606bd-cff2-4904-bd7b-81a0fa5b15d5-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-fg2d5\" (UID: \"59e606bd-cff2-4904-bd7b-81a0fa5b15d5\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5" Apr 23 16:42:57.739845 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:57.739806 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5" Apr 23 16:42:57.858615 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:57.858571 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5"] Apr 23 16:42:57.861264 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:42:57.861236 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59e606bd_cff2_4904_bd7b_81a0fa5b15d5.slice/crio-6da7ad5155bdc5aee791cc94387d1119eef9a76a4ce154bbffd8fc3709dc368d WatchSource:0}: Error finding container 6da7ad5155bdc5aee791cc94387d1119eef9a76a4ce154bbffd8fc3709dc368d: Status 404 returned error can't find the container with id 6da7ad5155bdc5aee791cc94387d1119eef9a76a4ce154bbffd8fc3709dc368d Apr 23 16:42:58.229234 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:58.229193 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5" event={"ID":"59e606bd-cff2-4904-bd7b-81a0fa5b15d5","Type":"ContainerStarted","Data":"d9b7aec6a55b67f67b9436be34ce901f9a0f0c6f54ba16c3bd5ed275470ef8f8"} Apr 23 16:42:58.229427 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:58.229241 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5" event={"ID":"59e606bd-cff2-4904-bd7b-81a0fa5b15d5","Type":"ContainerStarted","Data":"6da7ad5155bdc5aee791cc94387d1119eef9a76a4ce154bbffd8fc3709dc368d"} Apr 23 16:42:58.248549 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:58.248499 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-fg2d5" podStartSLOduration=2.021670524 podStartE2EDuration="2.248483852s" podCreationTimestamp="2026-04-23 16:42:56 +0000 UTC" firstStartedPulling="2026-04-23 16:42:57.862533092 +0000 UTC m=+457.715348138" lastFinishedPulling="2026-04-23 16:42:58.089346421 +0000 UTC m=+457.942161466" observedRunningTime="2026-04-23 16:42:58.246819493 +0000 UTC m=+458.099634568" watchObservedRunningTime="2026-04-23 16:42:58.248483852 +0000 UTC m=+458.101298916" Apr 23 16:42:58.811657 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:58.811616 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-z7k4r"] Apr 23 16:42:58.814508 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:58.814486 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-z7k4r" Apr 23 16:42:58.822015 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:58.821990 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-z7k4r"] Apr 23 16:42:58.921079 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:58.921047 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v98xt\" (UniqueName: \"kubernetes.io/projected/766278b4-babe-4f1b-9525-5589a3575cd9-kube-api-access-v98xt\") pod \"s3-tls-init-serving-z7k4r\" (UID: \"766278b4-babe-4f1b-9525-5589a3575cd9\") " pod="kserve/s3-tls-init-serving-z7k4r" Apr 23 16:42:59.022326 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:59.022280 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v98xt\" (UniqueName: \"kubernetes.io/projected/766278b4-babe-4f1b-9525-5589a3575cd9-kube-api-access-v98xt\") pod \"s3-tls-init-serving-z7k4r\" (UID: \"766278b4-babe-4f1b-9525-5589a3575cd9\") " pod="kserve/s3-tls-init-serving-z7k4r" Apr 23 16:42:59.030853 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:59.030830 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v98xt\" (UniqueName: \"kubernetes.io/projected/766278b4-babe-4f1b-9525-5589a3575cd9-kube-api-access-v98xt\") pod \"s3-tls-init-serving-z7k4r\" (UID: \"766278b4-babe-4f1b-9525-5589a3575cd9\") " pod="kserve/s3-tls-init-serving-z7k4r" Apr 23 16:42:59.131904 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:59.131819 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-z7k4r" Apr 23 16:42:59.269781 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:42:59.269735 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-z7k4r"] Apr 23 16:42:59.272703 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:42:59.272676 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod766278b4_babe_4f1b_9525_5589a3575cd9.slice/crio-21cfc190c92396cbd570e178c9fed19e41cb8a8a47f3fe04199f8e091dc95d89 WatchSource:0}: Error finding container 21cfc190c92396cbd570e178c9fed19e41cb8a8a47f3fe04199f8e091dc95d89: Status 404 returned error can't find the container with id 21cfc190c92396cbd570e178c9fed19e41cb8a8a47f3fe04199f8e091dc95d89 Apr 23 16:43:00.238137 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:00.238097 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-z7k4r" event={"ID":"766278b4-babe-4f1b-9525-5589a3575cd9","Type":"ContainerStarted","Data":"1086b1972333407a584f9df98b18c7fbbd0057eee888a116d35189ac110d9d43"} Apr 23 16:43:00.238137 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:00.238134 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-z7k4r" event={"ID":"766278b4-babe-4f1b-9525-5589a3575cd9","Type":"ContainerStarted","Data":"21cfc190c92396cbd570e178c9fed19e41cb8a8a47f3fe04199f8e091dc95d89"} Apr 23 16:43:00.254879 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:00.254835 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-z7k4r" podStartSLOduration=2.254821836 podStartE2EDuration="2.254821836s" podCreationTimestamp="2026-04-23 16:42:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:43:00.25336692 +0000 UTC m=+460.106181995" watchObservedRunningTime="2026-04-23 
16:43:00.254821836 +0000 UTC m=+460.107636899" Apr 23 16:43:04.252801 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:04.252766 2562 generic.go:358] "Generic (PLEG): container finished" podID="766278b4-babe-4f1b-9525-5589a3575cd9" containerID="1086b1972333407a584f9df98b18c7fbbd0057eee888a116d35189ac110d9d43" exitCode=0 Apr 23 16:43:04.253180 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:04.252819 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-z7k4r" event={"ID":"766278b4-babe-4f1b-9525-5589a3575cd9","Type":"ContainerDied","Data":"1086b1972333407a584f9df98b18c7fbbd0057eee888a116d35189ac110d9d43"} Apr 23 16:43:05.377987 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:05.377965 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-z7k4r" Apr 23 16:43:05.474486 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:05.474451 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v98xt\" (UniqueName: \"kubernetes.io/projected/766278b4-babe-4f1b-9525-5589a3575cd9-kube-api-access-v98xt\") pod \"766278b4-babe-4f1b-9525-5589a3575cd9\" (UID: \"766278b4-babe-4f1b-9525-5589a3575cd9\") " Apr 23 16:43:05.476505 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:05.476468 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/766278b4-babe-4f1b-9525-5589a3575cd9-kube-api-access-v98xt" (OuterVolumeSpecName: "kube-api-access-v98xt") pod "766278b4-babe-4f1b-9525-5589a3575cd9" (UID: "766278b4-babe-4f1b-9525-5589a3575cd9"). InnerVolumeSpecName "kube-api-access-v98xt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:43:05.575347 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:05.575245 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v98xt\" (UniqueName: \"kubernetes.io/projected/766278b4-babe-4f1b-9525-5589a3575cd9-kube-api-access-v98xt\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:43:06.260174 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:06.260143 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-z7k4r" Apr 23 16:43:06.260317 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:06.260142 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-z7k4r" event={"ID":"766278b4-babe-4f1b-9525-5589a3575cd9","Type":"ContainerDied","Data":"21cfc190c92396cbd570e178c9fed19e41cb8a8a47f3fe04199f8e091dc95d89"} Apr 23 16:43:06.260317 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:06.260260 2562 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21cfc190c92396cbd570e178c9fed19e41cb8a8a47f3fe04199f8e091dc95d89" Apr 23 16:43:15.722131 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:15.722093 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt"] Apr 23 16:43:15.722588 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:15.722416 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="766278b4-babe-4f1b-9525-5589a3575cd9" containerName="s3-tls-init-serving" Apr 23 16:43:15.722588 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:15.722429 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="766278b4-babe-4f1b-9525-5589a3575cd9" containerName="s3-tls-init-serving" Apr 23 16:43:15.722588 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:15.722477 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="766278b4-babe-4f1b-9525-5589a3575cd9" 
containerName="s3-tls-init-serving" Apr 23 16:43:15.726926 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:15.726893 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" Apr 23 16:43:15.729465 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:15.729444 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rtrj5\"" Apr 23 16:43:15.736067 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:15.736041 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt"] Apr 23 16:43:15.759777 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:15.759726 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7131f42-48ba-4e6b-992c-0ee4f7082c8c-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt\" (UID: \"c7131f42-48ba-4e6b-992c-0ee4f7082c8c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" Apr 23 16:43:15.860986 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:15.860948 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7131f42-48ba-4e6b-992c-0ee4f7082c8c-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt\" (UID: \"c7131f42-48ba-4e6b-992c-0ee4f7082c8c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" Apr 23 16:43:15.861314 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:15.861294 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7131f42-48ba-4e6b-992c-0ee4f7082c8c-kserve-provision-location\") pod 
\"isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt\" (UID: \"c7131f42-48ba-4e6b-992c-0ee4f7082c8c\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" Apr 23 16:43:16.037691 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:16.037608 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" Apr 23 16:43:16.157524 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:16.157499 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt"] Apr 23 16:43:16.160306 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:43:16.160264 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7131f42_48ba_4e6b_992c_0ee4f7082c8c.slice/crio-d80e45fa0936a0cd0f1479a74aad1b479be3d72b716e11e2d11eb3983680aab7 WatchSource:0}: Error finding container d80e45fa0936a0cd0f1479a74aad1b479be3d72b716e11e2d11eb3983680aab7: Status 404 returned error can't find the container with id d80e45fa0936a0cd0f1479a74aad1b479be3d72b716e11e2d11eb3983680aab7 Apr 23 16:43:16.291012 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:16.290921 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" event={"ID":"c7131f42-48ba-4e6b-992c-0ee4f7082c8c","Type":"ContainerStarted","Data":"d80e45fa0936a0cd0f1479a74aad1b479be3d72b716e11e2d11eb3983680aab7"} Apr 23 16:43:20.305058 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:20.305020 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" event={"ID":"c7131f42-48ba-4e6b-992c-0ee4f7082c8c","Type":"ContainerStarted","Data":"4552b515e409fc2a0499e6757e85e4012dead312ab52625d5ba845905327d434"} Apr 23 16:43:20.641445 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:20.641418 2562 scope.go:117] 
"RemoveContainer" containerID="2dc1118604faa682a1571b866c0bc4a0a1333cd0f6326f076bdc73c31aecaf17" Apr 23 16:43:24.317177 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:24.317138 2562 generic.go:358] "Generic (PLEG): container finished" podID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerID="4552b515e409fc2a0499e6757e85e4012dead312ab52625d5ba845905327d434" exitCode=0 Apr 23 16:43:24.317654 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:24.317217 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" event={"ID":"c7131f42-48ba-4e6b-992c-0ee4f7082c8c","Type":"ContainerDied","Data":"4552b515e409fc2a0499e6757e85e4012dead312ab52625d5ba845905327d434"} Apr 23 16:43:37.363484 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:37.363451 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" event={"ID":"c7131f42-48ba-4e6b-992c-0ee4f7082c8c","Type":"ContainerStarted","Data":"86ebffd1d94ce5bbf8bbb82da21548cd139e14f9e6fca3e102b2ffc6cb4e0a48"} Apr 23 16:43:40.377752 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:40.377702 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" event={"ID":"c7131f42-48ba-4e6b-992c-0ee4f7082c8c","Type":"ContainerStarted","Data":"97f534595bd1cb948b1912f23bda9c26e99ed70e60f32ddfd6e1a96186cb1a68"} Apr 23 16:43:40.378198 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:40.377890 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" Apr 23 16:43:40.378198 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:40.377925 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" Apr 23 16:43:40.379368 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:40.379314 
2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 16:43:40.379961 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:40.379940 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:43:40.395693 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:40.395645 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podStartSLOduration=1.8921537590000002 podStartE2EDuration="25.395630263s" podCreationTimestamp="2026-04-23 16:43:15 +0000 UTC" firstStartedPulling="2026-04-23 16:43:16.16220298 +0000 UTC m=+476.015018025" lastFinishedPulling="2026-04-23 16:43:39.665679484 +0000 UTC m=+499.518494529" observedRunningTime="2026-04-23 16:43:40.395219119 +0000 UTC m=+500.248034184" watchObservedRunningTime="2026-04-23 16:43:40.395630263 +0000 UTC m=+500.248445326" Apr 23 16:43:41.381311 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:41.381267 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 16:43:41.381696 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:41.381677 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="agent" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 23 16:43:51.382215 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:51.382159 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 16:43:51.382718 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:43:51.382661 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:44:01.381495 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:44:01.381451 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 16:44:01.382041 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:44:01.382001 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:44:11.382212 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:44:11.382148 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused" Apr 23 16:44:11.382654 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:44:11.382629 2562 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:44:21.381318 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:44:21.381269 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 23 16:44:21.381812 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:44:21.381726 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:44:31.382013 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:44:31.381962 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 23 16:44:31.382584 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:44:31.382416 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:44:41.381887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:44:41.381836 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 23 16:44:41.382413 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:44:41.382215 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:44:51.381949 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:44:51.381918 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt"
Apr 23 16:44:51.382357 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:44:51.381988 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt"
Apr 23 16:45:00.871806 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:00.871774 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt"]
Apr 23 16:45:00.872239 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:00.872075 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="kserve-container" containerID="cri-o://86ebffd1d94ce5bbf8bbb82da21548cd139e14f9e6fca3e102b2ffc6cb4e0a48" gracePeriod=30
Apr 23 16:45:00.872239 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:00.872159 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="agent" containerID="cri-o://97f534595bd1cb948b1912f23bda9c26e99ed70e60f32ddfd6e1a96186cb1a68" gracePeriod=30
Apr 23 16:45:00.977490 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:00.977452 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp"]
Apr 23 16:45:00.980936 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:00.980921 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp"
Apr 23 16:45:00.990043 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:00.990019 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp"]
Apr 23 16:45:01.069920 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:01.069878 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97432120-a47d-4b89-8d6d-ff310518e5aa-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp\" (UID: \"97432120-a47d-4b89-8d6d-ff310518e5aa\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp"
Apr 23 16:45:01.171257 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:01.171218 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97432120-a47d-4b89-8d6d-ff310518e5aa-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp\" (UID: \"97432120-a47d-4b89-8d6d-ff310518e5aa\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp"
Apr 23 16:45:01.171605 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:01.171584 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97432120-a47d-4b89-8d6d-ff310518e5aa-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp\" (UID: \"97432120-a47d-4b89-8d6d-ff310518e5aa\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp"
Apr 23 16:45:01.292149 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:01.292116 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp"
Apr 23 16:45:01.381608 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:01.381573 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 23 16:45:01.382140 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:01.382107 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:45:01.413870 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:01.413844 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp"]
Apr 23 16:45:01.415933 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:45:01.415906 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97432120_a47d_4b89_8d6d_ff310518e5aa.slice/crio-b95bdb17a51c7ce5700ed19f9b7819adc79e4d85d804f39ee9a4df2f53c16bcc WatchSource:0}: Error finding container b95bdb17a51c7ce5700ed19f9b7819adc79e4d85d804f39ee9a4df2f53c16bcc: Status 404 returned error can't find the container with id b95bdb17a51c7ce5700ed19f9b7819adc79e4d85d804f39ee9a4df2f53c16bcc
Apr 23 16:45:01.617012 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:01.616970 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" event={"ID":"97432120-a47d-4b89-8d6d-ff310518e5aa","Type":"ContainerStarted","Data":"929913718b5c0458d7fc056b91c0e8480f4d2f86ae06bc2e5b1e7325c434139f"}
Apr 23 16:45:01.617012 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:01.617013 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" event={"ID":"97432120-a47d-4b89-8d6d-ff310518e5aa","Type":"ContainerStarted","Data":"b95bdb17a51c7ce5700ed19f9b7819adc79e4d85d804f39ee9a4df2f53c16bcc"}
Apr 23 16:45:05.633732 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:05.633702 2562 generic.go:358] "Generic (PLEG): container finished" podID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerID="86ebffd1d94ce5bbf8bbb82da21548cd139e14f9e6fca3e102b2ffc6cb4e0a48" exitCode=0
Apr 23 16:45:05.634166 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:05.633773 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" event={"ID":"c7131f42-48ba-4e6b-992c-0ee4f7082c8c","Type":"ContainerDied","Data":"86ebffd1d94ce5bbf8bbb82da21548cd139e14f9e6fca3e102b2ffc6cb4e0a48"}
Apr 23 16:45:05.635030 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:05.635009 2562 generic.go:358] "Generic (PLEG): container finished" podID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerID="929913718b5c0458d7fc056b91c0e8480f4d2f86ae06bc2e5b1e7325c434139f" exitCode=0
Apr 23 16:45:05.635140 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:05.635047 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" event={"ID":"97432120-a47d-4b89-8d6d-ff310518e5aa","Type":"ContainerDied","Data":"929913718b5c0458d7fc056b91c0e8480f4d2f86ae06bc2e5b1e7325c434139f"}
Apr 23 16:45:06.639969 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:06.639936 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" event={"ID":"97432120-a47d-4b89-8d6d-ff310518e5aa","Type":"ContainerStarted","Data":"b3f01e03bbca457d6bea507f5ec9a50f6f1709110781048587abc50936b8d1c0"}
Apr 23 16:45:06.640407 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:06.639980 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" event={"ID":"97432120-a47d-4b89-8d6d-ff310518e5aa","Type":"ContainerStarted","Data":"2e40c33be41f083fd7d27705ff00a349b6094d98fb201bf8f6b980ad7fd101dc"}
Apr 23 16:45:06.640407 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:06.640330 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp"
Apr 23 16:45:06.640407 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:06.640361 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp"
Apr 23 16:45:06.641996 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:06.641965 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:5000: connect: connection refused"
Apr 23 16:45:06.642539 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:06.642513 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:45:06.660788 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:06.660717 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podStartSLOduration=6.660700192 podStartE2EDuration="6.660700192s" podCreationTimestamp="2026-04-23 16:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:45:06.659305389 +0000 UTC m=+586.512120476" watchObservedRunningTime="2026-04-23 16:45:06.660700192 +0000 UTC m=+586.513515257"
Apr 23 16:45:07.643910 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:07.643865 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:5000: connect: connection refused"
Apr 23 16:45:07.644328 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:07.644271 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:45:11.381344 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:11.381295 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 23 16:45:11.381700 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:11.381617 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:45:17.644494 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:17.644447 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:5000: connect: connection refused"
Apr 23 16:45:17.645102 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:17.645080 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:45:21.381815 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:21.381769 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.24:8080: connect: connection refused"
Apr 23 16:45:21.382250 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:21.381917 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt"
Apr 23 16:45:21.382250 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:21.381981 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:45:21.382250 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:21.382105 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt"
Apr 23 16:45:27.644529 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:27.644481 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:5000: connect: connection refused"
Apr 23 16:45:27.645112 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:27.645081 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:45:31.008488 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.008467 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt"
Apr 23 16:45:31.128764 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.128664 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7131f42-48ba-4e6b-992c-0ee4f7082c8c-kserve-provision-location\") pod \"c7131f42-48ba-4e6b-992c-0ee4f7082c8c\" (UID: \"c7131f42-48ba-4e6b-992c-0ee4f7082c8c\") "
Apr 23 16:45:31.128985 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.128962 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7131f42-48ba-4e6b-992c-0ee4f7082c8c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c7131f42-48ba-4e6b-992c-0ee4f7082c8c" (UID: "c7131f42-48ba-4e6b-992c-0ee4f7082c8c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:45:31.229233 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.229203 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7131f42-48ba-4e6b-992c-0ee4f7082c8c-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:45:31.717179 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.717146 2562 generic.go:358] "Generic (PLEG): container finished" podID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerID="97f534595bd1cb948b1912f23bda9c26e99ed70e60f32ddfd6e1a96186cb1a68" exitCode=0
Apr 23 16:45:31.717371 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.717214 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" event={"ID":"c7131f42-48ba-4e6b-992c-0ee4f7082c8c","Type":"ContainerDied","Data":"97f534595bd1cb948b1912f23bda9c26e99ed70e60f32ddfd6e1a96186cb1a68"}
Apr 23 16:45:31.717371 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.717228 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt"
Apr 23 16:45:31.717371 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.717240 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt" event={"ID":"c7131f42-48ba-4e6b-992c-0ee4f7082c8c","Type":"ContainerDied","Data":"d80e45fa0936a0cd0f1479a74aad1b479be3d72b716e11e2d11eb3983680aab7"}
Apr 23 16:45:31.717371 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.717255 2562 scope.go:117] "RemoveContainer" containerID="97f534595bd1cb948b1912f23bda9c26e99ed70e60f32ddfd6e1a96186cb1a68"
Apr 23 16:45:31.725347 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.725322 2562 scope.go:117] "RemoveContainer" containerID="86ebffd1d94ce5bbf8bbb82da21548cd139e14f9e6fca3e102b2ffc6cb4e0a48"
Apr 23 16:45:31.732279 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.732263 2562 scope.go:117] "RemoveContainer" containerID="4552b515e409fc2a0499e6757e85e4012dead312ab52625d5ba845905327d434"
Apr 23 16:45:31.739031 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.739013 2562 scope.go:117] "RemoveContainer" containerID="97f534595bd1cb948b1912f23bda9c26e99ed70e60f32ddfd6e1a96186cb1a68"
Apr 23 16:45:31.739440 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:45:31.739416 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97f534595bd1cb948b1912f23bda9c26e99ed70e60f32ddfd6e1a96186cb1a68\": container with ID starting with 97f534595bd1cb948b1912f23bda9c26e99ed70e60f32ddfd6e1a96186cb1a68 not found: ID does not exist" containerID="97f534595bd1cb948b1912f23bda9c26e99ed70e60f32ddfd6e1a96186cb1a68"
Apr 23 16:45:31.739534 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.739448 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97f534595bd1cb948b1912f23bda9c26e99ed70e60f32ddfd6e1a96186cb1a68"} err="failed to get container status \"97f534595bd1cb948b1912f23bda9c26e99ed70e60f32ddfd6e1a96186cb1a68\": rpc error: code = NotFound desc = could not find container \"97f534595bd1cb948b1912f23bda9c26e99ed70e60f32ddfd6e1a96186cb1a68\": container with ID starting with 97f534595bd1cb948b1912f23bda9c26e99ed70e60f32ddfd6e1a96186cb1a68 not found: ID does not exist"
Apr 23 16:45:31.739534 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.739466 2562 scope.go:117] "RemoveContainer" containerID="86ebffd1d94ce5bbf8bbb82da21548cd139e14f9e6fca3e102b2ffc6cb4e0a48"
Apr 23 16:45:31.739824 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:45:31.739802 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86ebffd1d94ce5bbf8bbb82da21548cd139e14f9e6fca3e102b2ffc6cb4e0a48\": container with ID starting with 86ebffd1d94ce5bbf8bbb82da21548cd139e14f9e6fca3e102b2ffc6cb4e0a48 not found: ID does not exist" containerID="86ebffd1d94ce5bbf8bbb82da21548cd139e14f9e6fca3e102b2ffc6cb4e0a48"
Apr 23 16:45:31.739824 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.739827 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86ebffd1d94ce5bbf8bbb82da21548cd139e14f9e6fca3e102b2ffc6cb4e0a48"} err="failed to get container status \"86ebffd1d94ce5bbf8bbb82da21548cd139e14f9e6fca3e102b2ffc6cb4e0a48\": rpc error: code = NotFound desc = could not find container \"86ebffd1d94ce5bbf8bbb82da21548cd139e14f9e6fca3e102b2ffc6cb4e0a48\": container with ID starting with 86ebffd1d94ce5bbf8bbb82da21548cd139e14f9e6fca3e102b2ffc6cb4e0a48 not found: ID does not exist"
Apr 23 16:45:31.740002 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.739848 2562 scope.go:117] "RemoveContainer" containerID="4552b515e409fc2a0499e6757e85e4012dead312ab52625d5ba845905327d434"
Apr 23 16:45:31.740122 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:45:31.740104 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4552b515e409fc2a0499e6757e85e4012dead312ab52625d5ba845905327d434\": container with ID starting with 4552b515e409fc2a0499e6757e85e4012dead312ab52625d5ba845905327d434 not found: ID does not exist" containerID="4552b515e409fc2a0499e6757e85e4012dead312ab52625d5ba845905327d434"
Apr 23 16:45:31.740180 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.740125 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4552b515e409fc2a0499e6757e85e4012dead312ab52625d5ba845905327d434"} err="failed to get container status \"4552b515e409fc2a0499e6757e85e4012dead312ab52625d5ba845905327d434\": rpc error: code = NotFound desc = could not find container \"4552b515e409fc2a0499e6757e85e4012dead312ab52625d5ba845905327d434\": container with ID starting with 4552b515e409fc2a0499e6757e85e4012dead312ab52625d5ba845905327d434 not found: ID does not exist"
Apr 23 16:45:31.741565 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.741544 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt"]
Apr 23 16:45:31.746791 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:31.746769 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5bf56f45fb-ctprt"]
Apr 23 16:45:32.741623 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:32.741589 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" path="/var/lib/kubelet/pods/c7131f42-48ba-4e6b-992c-0ee4f7082c8c/volumes"
Apr 23 16:45:37.644349 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:37.644292 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:5000: connect: connection refused"
Apr 23 16:45:37.644873 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:37.644674 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:45:47.644188 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:47.644096 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:5000: connect: connection refused"
Apr 23 16:45:47.644646 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:47.644486 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:45:57.644287 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:57.644226 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:5000: connect: connection refused"
Apr 23 16:45:57.644733 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:45:57.644669 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:46:07.643957 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:07.643915 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:5000: connect: connection refused"
Apr 23 16:46:07.644440 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:07.644304 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:46:17.645229 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:17.645197 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp"
Apr 23 16:46:17.647677 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:17.645692 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp"
Apr 23 16:46:26.066286 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:26.066254 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp"]
Apr 23 16:46:26.066643 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:26.066559 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="kserve-container" containerID="cri-o://2e40c33be41f083fd7d27705ff00a349b6094d98fb201bf8f6b980ad7fd101dc" gracePeriod=30
Apr 23 16:46:26.066705 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:26.066643 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="agent" containerID="cri-o://b3f01e03bbca457d6bea507f5ec9a50f6f1709110781048587abc50936b8d1c0" gracePeriod=30
Apr 23 16:46:27.645062 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:27.645023 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:46:27.646276 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:27.646249 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:5000: connect: connection refused"
Apr 23 16:46:30.892840 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:30.892805 2562 generic.go:358] "Generic (PLEG): container finished" podID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerID="2e40c33be41f083fd7d27705ff00a349b6094d98fb201bf8f6b980ad7fd101dc" exitCode=0
Apr 23 16:46:30.893178 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:30.892847 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" event={"ID":"97432120-a47d-4b89-8d6d-ff310518e5aa","Type":"ContainerDied","Data":"2e40c33be41f083fd7d27705ff00a349b6094d98fb201bf8f6b980ad7fd101dc"}
Apr 23 16:46:36.159719 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.159681 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr"]
Apr 23 16:46:36.160139 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.160020 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="agent"
Apr 23 16:46:36.160139 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.160033 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="agent"
Apr 23 16:46:36.160139 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.160045 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="kserve-container"
Apr 23 16:46:36.160139 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.160051 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="kserve-container"
Apr 23 16:46:36.160139 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.160059 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="storage-initializer"
Apr 23 16:46:36.160139 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.160066 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="storage-initializer"
Apr 23 16:46:36.160139 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.160110 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="agent"
Apr 23 16:46:36.160139 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.160121 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7131f42-48ba-4e6b-992c-0ee4f7082c8c" containerName="kserve-container"
Apr 23 16:46:36.163123 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.163106 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr"
Apr 23 16:46:36.173188 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.173161 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr"]
Apr 23 16:46:36.176155 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.176132 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1144f65-d5c6-49c5-84bc-e4585fbe22c2-kserve-provision-location\") pod \"isvc-logger-predictor-595f4f5b7-w76qr\" (UID: \"f1144f65-d5c6-49c5-84bc-e4585fbe22c2\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr"
Apr 23 16:46:36.276711 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.276662 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1144f65-d5c6-49c5-84bc-e4585fbe22c2-kserve-provision-location\") pod \"isvc-logger-predictor-595f4f5b7-w76qr\" (UID: \"f1144f65-d5c6-49c5-84bc-e4585fbe22c2\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr"
Apr 23 16:46:36.277108 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.277082 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1144f65-d5c6-49c5-84bc-e4585fbe22c2-kserve-provision-location\") pod \"isvc-logger-predictor-595f4f5b7-w76qr\" (UID: \"f1144f65-d5c6-49c5-84bc-e4585fbe22c2\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr"
Apr 23 16:46:36.473616 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.473526 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr"
Apr 23 16:46:36.598619 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.598592 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr"]
Apr 23 16:46:36.601194 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:46:36.601165 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1144f65_d5c6_49c5_84bc_e4585fbe22c2.slice/crio-a6101ea737a5e743542e2bf46e99090b1f65f27864f0e746dc8810b3123c587c WatchSource:0}: Error finding container a6101ea737a5e743542e2bf46e99090b1f65f27864f0e746dc8810b3123c587c: Status 404 returned error can't find the container with id a6101ea737a5e743542e2bf46e99090b1f65f27864f0e746dc8810b3123c587c
Apr 23 16:46:36.602991 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.602971 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 16:46:36.911063 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.911028 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" event={"ID":"f1144f65-d5c6-49c5-84bc-e4585fbe22c2","Type":"ContainerStarted","Data":"8753d2de1139b4a43183eacac75042e874e42794d60477cddd233a7da798ec3c"}
Apr 23 16:46:36.911063 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:36.911066 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" event={"ID":"f1144f65-d5c6-49c5-84bc-e4585fbe22c2","Type":"ContainerStarted","Data":"a6101ea737a5e743542e2bf46e99090b1f65f27864f0e746dc8810b3123c587c"}
Apr 23 16:46:37.644793 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:37.644725 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:46:37.645985 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:37.645956 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:5000: connect: connection refused"
Apr 23 16:46:40.924057 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:40.924022 2562 generic.go:358] "Generic (PLEG): container finished" podID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerID="8753d2de1139b4a43183eacac75042e874e42794d60477cddd233a7da798ec3c" exitCode=0
Apr 23 16:46:40.924400 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:40.924065 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" event={"ID":"f1144f65-d5c6-49c5-84bc-e4585fbe22c2","Type":"ContainerDied","Data":"8753d2de1139b4a43183eacac75042e874e42794d60477cddd233a7da798ec3c"}
Apr 23 16:46:41.929257 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:41.929222 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" event={"ID":"f1144f65-d5c6-49c5-84bc-e4585fbe22c2","Type":"ContainerStarted","Data":"5c3f12f91c1952c28748db61a256e8c82858d688e0ddebd7c4a57ba2aadbb362"}
Apr 23 16:46:41.929257 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:41.929266 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" event={"ID":"f1144f65-d5c6-49c5-84bc-e4585fbe22c2","Type":"ContainerStarted","Data":"a7bbf341203eca984b74a3572d396691a3866a3e982ce220eeabbf97faa17616"}
Apr 23 16:46:41.929796 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:41.929549 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr"
Apr 23 16:46:41.929796 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:41.929578 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr"
Apr 23 16:46:41.930984 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:41.930941 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 23 16:46:41.931650 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:41.931624 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:46:41.949064 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:41.949011 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podStartSLOduration=5.948998035 podStartE2EDuration="5.948998035s" podCreationTimestamp="2026-04-23 16:46:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:46:41.945834245 +0000 UTC m=+681.798649300" watchObservedRunningTime="2026-04-23 16:46:41.948998035 +0000 UTC m=+681.801813098"
Apr 23 16:46:42.933118 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:42.933077 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused"
Apr 23 16:46:42.933568 ip-10-0-135-57
kubenswrapper[2562]: I0423 16:46:42.933425 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:46:47.644517 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:47.644477 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:46:47.645063 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:47.644617 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" Apr 23 16:46:47.646717 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:47.646693 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:5000: connect: connection refused" Apr 23 16:46:47.646824 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:47.646811 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" Apr 23 16:46:52.934478 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:52.933890 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 16:46:52.934478 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:52.934366 2562 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:46:56.200377 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:56.200353 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" Apr 23 16:46:56.226638 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:56.226605 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97432120-a47d-4b89-8d6d-ff310518e5aa-kserve-provision-location\") pod \"97432120-a47d-4b89-8d6d-ff310518e5aa\" (UID: \"97432120-a47d-4b89-8d6d-ff310518e5aa\") " Apr 23 16:46:56.226931 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:56.226909 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97432120-a47d-4b89-8d6d-ff310518e5aa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "97432120-a47d-4b89-8d6d-ff310518e5aa" (UID: "97432120-a47d-4b89-8d6d-ff310518e5aa"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:46:56.327275 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:56.327189 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97432120-a47d-4b89-8d6d-ff310518e5aa-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:46:56.980761 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:56.980707 2562 generic.go:358] "Generic (PLEG): container finished" podID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerID="b3f01e03bbca457d6bea507f5ec9a50f6f1709110781048587abc50936b8d1c0" exitCode=0 Apr 23 16:46:56.980936 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:56.980789 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" event={"ID":"97432120-a47d-4b89-8d6d-ff310518e5aa","Type":"ContainerDied","Data":"b3f01e03bbca457d6bea507f5ec9a50f6f1709110781048587abc50936b8d1c0"} Apr 23 16:46:56.980936 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:56.980816 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" Apr 23 16:46:56.980936 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:56.980824 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp" event={"ID":"97432120-a47d-4b89-8d6d-ff310518e5aa","Type":"ContainerDied","Data":"b95bdb17a51c7ce5700ed19f9b7819adc79e4d85d804f39ee9a4df2f53c16bcc"} Apr 23 16:46:56.980936 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:56.980840 2562 scope.go:117] "RemoveContainer" containerID="b3f01e03bbca457d6bea507f5ec9a50f6f1709110781048587abc50936b8d1c0" Apr 23 16:46:56.988442 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:56.988410 2562 scope.go:117] "RemoveContainer" containerID="2e40c33be41f083fd7d27705ff00a349b6094d98fb201bf8f6b980ad7fd101dc" Apr 23 16:46:56.995253 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:56.995236 2562 scope.go:117] "RemoveContainer" containerID="929913718b5c0458d7fc056b91c0e8480f4d2f86ae06bc2e5b1e7325c434139f" Apr 23 16:46:56.997936 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:56.997916 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp"] Apr 23 16:46:57.001791 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:57.001714 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-59fcd7879c-ncrgp"] Apr 23 16:46:57.003305 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:57.003284 2562 scope.go:117] "RemoveContainer" containerID="b3f01e03bbca457d6bea507f5ec9a50f6f1709110781048587abc50936b8d1c0" Apr 23 16:46:57.003587 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:46:57.003565 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3f01e03bbca457d6bea507f5ec9a50f6f1709110781048587abc50936b8d1c0\": container with ID starting 
with b3f01e03bbca457d6bea507f5ec9a50f6f1709110781048587abc50936b8d1c0 not found: ID does not exist" containerID="b3f01e03bbca457d6bea507f5ec9a50f6f1709110781048587abc50936b8d1c0" Apr 23 16:46:57.003651 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:57.003593 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3f01e03bbca457d6bea507f5ec9a50f6f1709110781048587abc50936b8d1c0"} err="failed to get container status \"b3f01e03bbca457d6bea507f5ec9a50f6f1709110781048587abc50936b8d1c0\": rpc error: code = NotFound desc = could not find container \"b3f01e03bbca457d6bea507f5ec9a50f6f1709110781048587abc50936b8d1c0\": container with ID starting with b3f01e03bbca457d6bea507f5ec9a50f6f1709110781048587abc50936b8d1c0 not found: ID does not exist" Apr 23 16:46:57.003651 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:57.003612 2562 scope.go:117] "RemoveContainer" containerID="2e40c33be41f083fd7d27705ff00a349b6094d98fb201bf8f6b980ad7fd101dc" Apr 23 16:46:57.003887 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:46:57.003871 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e40c33be41f083fd7d27705ff00a349b6094d98fb201bf8f6b980ad7fd101dc\": container with ID starting with 2e40c33be41f083fd7d27705ff00a349b6094d98fb201bf8f6b980ad7fd101dc not found: ID does not exist" containerID="2e40c33be41f083fd7d27705ff00a349b6094d98fb201bf8f6b980ad7fd101dc" Apr 23 16:46:57.003943 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:57.003890 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e40c33be41f083fd7d27705ff00a349b6094d98fb201bf8f6b980ad7fd101dc"} err="failed to get container status \"2e40c33be41f083fd7d27705ff00a349b6094d98fb201bf8f6b980ad7fd101dc\": rpc error: code = NotFound desc = could not find container \"2e40c33be41f083fd7d27705ff00a349b6094d98fb201bf8f6b980ad7fd101dc\": container with ID starting with 
2e40c33be41f083fd7d27705ff00a349b6094d98fb201bf8f6b980ad7fd101dc not found: ID does not exist" Apr 23 16:46:57.003943 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:57.003903 2562 scope.go:117] "RemoveContainer" containerID="929913718b5c0458d7fc056b91c0e8480f4d2f86ae06bc2e5b1e7325c434139f" Apr 23 16:46:57.004106 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:46:57.004090 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"929913718b5c0458d7fc056b91c0e8480f4d2f86ae06bc2e5b1e7325c434139f\": container with ID starting with 929913718b5c0458d7fc056b91c0e8480f4d2f86ae06bc2e5b1e7325c434139f not found: ID does not exist" containerID="929913718b5c0458d7fc056b91c0e8480f4d2f86ae06bc2e5b1e7325c434139f" Apr 23 16:46:57.004147 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:57.004109 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"929913718b5c0458d7fc056b91c0e8480f4d2f86ae06bc2e5b1e7325c434139f"} err="failed to get container status \"929913718b5c0458d7fc056b91c0e8480f4d2f86ae06bc2e5b1e7325c434139f\": rpc error: code = NotFound desc = could not find container \"929913718b5c0458d7fc056b91c0e8480f4d2f86ae06bc2e5b1e7325c434139f\": container with ID starting with 929913718b5c0458d7fc056b91c0e8480f4d2f86ae06bc2e5b1e7325c434139f not found: ID does not exist" Apr 23 16:46:58.743150 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:46:58.743113 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" path="/var/lib/kubelet/pods/97432120-a47d-4b89-8d6d-ff310518e5aa/volumes" Apr 23 16:47:02.933080 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:47:02.933031 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: 
connection refused" Apr 23 16:47:02.933475 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:47:02.933438 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:47:12.933962 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:47:12.933910 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 16:47:12.934458 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:47:12.934358 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:47:22.933074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:47:22.933019 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 16:47:22.933563 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:47:22.933540 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:47:32.933779 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:47:32.933709 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" 
podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 16:47:32.934254 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:47:32.934228 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:47:42.933427 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:47:42.933374 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 16:47:42.933939 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:47:42.933902 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:47:52.933918 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:47:52.933884 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" Apr 23 16:47:52.934427 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:47:52.933942 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" Apr 23 16:48:01.400109 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.400077 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr"] Apr 23 16:48:01.400506 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.400350 2562 kuberuntime_container.go:864] "Killing container with a grace 
period" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="kserve-container" containerID="cri-o://a7bbf341203eca984b74a3572d396691a3866a3e982ce220eeabbf97faa17616" gracePeriod=30 Apr 23 16:48:01.400575 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.400476 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="agent" containerID="cri-o://5c3f12f91c1952c28748db61a256e8c82858d688e0ddebd7c4a57ba2aadbb362" gracePeriod=30 Apr 23 16:48:01.431433 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.431400 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w"] Apr 23 16:48:01.431802 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.431785 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="agent" Apr 23 16:48:01.431892 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.431804 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="agent" Apr 23 16:48:01.431892 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.431820 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="kserve-container" Apr 23 16:48:01.431892 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.431828 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="kserve-container" Apr 23 16:48:01.431892 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.431854 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="storage-initializer" Apr 23 16:48:01.431892 ip-10-0-135-57 
kubenswrapper[2562]: I0423 16:48:01.431863 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="storage-initializer" Apr 23 16:48:01.432134 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.431951 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="kserve-container" Apr 23 16:48:01.432134 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.431969 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="97432120-a47d-4b89-8d6d-ff310518e5aa" containerName="agent" Apr 23 16:48:01.435046 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.435025 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" Apr 23 16:48:01.442923 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.442877 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w"] Apr 23 16:48:01.535006 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.534949 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5e470f8-7279-40a1-bdc8-9cc97ab028e8-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-jlq4w\" (UID: \"e5e470f8-7279-40a1-bdc8-9cc97ab028e8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" Apr 23 16:48:01.636059 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.636020 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5e470f8-7279-40a1-bdc8-9cc97ab028e8-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-jlq4w\" (UID: \"e5e470f8-7279-40a1-bdc8-9cc97ab028e8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" Apr 23 
16:48:01.636418 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.636396 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5e470f8-7279-40a1-bdc8-9cc97ab028e8-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-jlq4w\" (UID: \"e5e470f8-7279-40a1-bdc8-9cc97ab028e8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" Apr 23 16:48:01.749012 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.748923 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" Apr 23 16:48:01.872120 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:01.872068 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w"] Apr 23 16:48:01.874362 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:48:01.874335 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5e470f8_7279_40a1_bdc8_9cc97ab028e8.slice/crio-4288a2806f5493b1d3e5cb5141aa583e408e919cbaf7bf5ea1a9ab9c2d519cd3 WatchSource:0}: Error finding container 4288a2806f5493b1d3e5cb5141aa583e408e919cbaf7bf5ea1a9ab9c2d519cd3: Status 404 returned error can't find the container with id 4288a2806f5493b1d3e5cb5141aa583e408e919cbaf7bf5ea1a9ab9c2d519cd3 Apr 23 16:48:02.174677 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:02.174629 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" event={"ID":"e5e470f8-7279-40a1-bdc8-9cc97ab028e8","Type":"ContainerStarted","Data":"3c00d48f34b1f2e11945a22e9756f786dd3ec9773afe8c57dd0feb2e85a50a41"} Apr 23 16:48:02.174677 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:02.174668 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" 
event={"ID":"e5e470f8-7279-40a1-bdc8-9cc97ab028e8","Type":"ContainerStarted","Data":"4288a2806f5493b1d3e5cb5141aa583e408e919cbaf7bf5ea1a9ab9c2d519cd3"} Apr 23 16:48:02.933571 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:02.933522 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 16:48:02.934021 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:02.933905 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:48:06.189763 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:06.189665 2562 generic.go:358] "Generic (PLEG): container finished" podID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerID="a7bbf341203eca984b74a3572d396691a3866a3e982ce220eeabbf97faa17616" exitCode=0 Apr 23 16:48:06.189763 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:06.189751 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" event={"ID":"f1144f65-d5c6-49c5-84bc-e4585fbe22c2","Type":"ContainerDied","Data":"a7bbf341203eca984b74a3572d396691a3866a3e982ce220eeabbf97faa17616"} Apr 23 16:48:06.190991 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:06.190973 2562 generic.go:358] "Generic (PLEG): container finished" podID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerID="3c00d48f34b1f2e11945a22e9756f786dd3ec9773afe8c57dd0feb2e85a50a41" exitCode=0 Apr 23 16:48:06.191069 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:06.191016 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" 
event={"ID":"e5e470f8-7279-40a1-bdc8-9cc97ab028e8","Type":"ContainerDied","Data":"3c00d48f34b1f2e11945a22e9756f786dd3ec9773afe8c57dd0feb2e85a50a41"} Apr 23 16:48:12.933146 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:12.933092 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 16:48:12.933612 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:12.933494 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:48:15.223487 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:15.223453 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" event={"ID":"e5e470f8-7279-40a1-bdc8-9cc97ab028e8","Type":"ContainerStarted","Data":"6fb546fb69bd5ed998adede50ab80625e41a21b9caaa612e9c9ddc122939c33b"} Apr 23 16:48:15.223890 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:15.223759 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" Apr 23 16:48:15.225143 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:15.225118 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 16:48:15.241141 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:15.241088 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" podStartSLOduration=6.181644198 podStartE2EDuration="14.241070829s" podCreationTimestamp="2026-04-23 16:48:01 +0000 UTC" firstStartedPulling="2026-04-23 16:48:06.192232682 +0000 UTC m=+766.045047727" lastFinishedPulling="2026-04-23 16:48:14.251659312 +0000 UTC m=+774.104474358" observedRunningTime="2026-04-23 16:48:15.239152104 +0000 UTC m=+775.091967171" watchObservedRunningTime="2026-04-23 16:48:15.241070829 +0000 UTC m=+775.093885894" Apr 23 16:48:16.226620 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:16.226579 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 16:48:22.933199 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:22.933150 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:8080: connect: connection refused" Apr 23 16:48:22.933632 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:22.933291 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" Apr 23 16:48:22.933632 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:22.933471 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:48:22.933632 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:22.933572 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" Apr 23 16:48:26.227110 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:26.227069 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 16:48:31.585340 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:31.585315 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" Apr 23 16:48:31.699837 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:31.699727 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1144f65-d5c6-49c5-84bc-e4585fbe22c2-kserve-provision-location\") pod \"f1144f65-d5c6-49c5-84bc-e4585fbe22c2\" (UID: \"f1144f65-d5c6-49c5-84bc-e4585fbe22c2\") " Apr 23 16:48:31.700058 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:31.700035 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1144f65-d5c6-49c5-84bc-e4585fbe22c2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f1144f65-d5c6-49c5-84bc-e4585fbe22c2" (UID: "f1144f65-d5c6-49c5-84bc-e4585fbe22c2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:48:31.800603 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:31.800568 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f1144f65-d5c6-49c5-84bc-e4585fbe22c2-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:48:32.279848 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:32.279810 2562 generic.go:358] "Generic (PLEG): container finished" podID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerID="5c3f12f91c1952c28748db61a256e8c82858d688e0ddebd7c4a57ba2aadbb362" exitCode=137 Apr 23 16:48:32.280018 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:32.279887 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" event={"ID":"f1144f65-d5c6-49c5-84bc-e4585fbe22c2","Type":"ContainerDied","Data":"5c3f12f91c1952c28748db61a256e8c82858d688e0ddebd7c4a57ba2aadbb362"} Apr 23 16:48:32.280018 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:32.279925 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" event={"ID":"f1144f65-d5c6-49c5-84bc-e4585fbe22c2","Type":"ContainerDied","Data":"a6101ea737a5e743542e2bf46e99090b1f65f27864f0e746dc8810b3123c587c"} Apr 23 16:48:32.280018 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:32.279942 2562 scope.go:117] "RemoveContainer" containerID="5c3f12f91c1952c28748db61a256e8c82858d688e0ddebd7c4a57ba2aadbb362" Apr 23 16:48:32.280018 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:32.279897 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr" Apr 23 16:48:32.288631 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:32.288550 2562 scope.go:117] "RemoveContainer" containerID="a7bbf341203eca984b74a3572d396691a3866a3e982ce220eeabbf97faa17616" Apr 23 16:48:32.295758 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:32.295718 2562 scope.go:117] "RemoveContainer" containerID="8753d2de1139b4a43183eacac75042e874e42794d60477cddd233a7da798ec3c" Apr 23 16:48:32.308167 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:32.307153 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr"] Apr 23 16:48:32.308341 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:32.308227 2562 scope.go:117] "RemoveContainer" containerID="5c3f12f91c1952c28748db61a256e8c82858d688e0ddebd7c4a57ba2aadbb362" Apr 23 16:48:32.308878 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:48:32.308851 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c3f12f91c1952c28748db61a256e8c82858d688e0ddebd7c4a57ba2aadbb362\": container with ID starting with 5c3f12f91c1952c28748db61a256e8c82858d688e0ddebd7c4a57ba2aadbb362 not found: ID does not exist" containerID="5c3f12f91c1952c28748db61a256e8c82858d688e0ddebd7c4a57ba2aadbb362" Apr 23 16:48:32.308975 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:32.308891 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c3f12f91c1952c28748db61a256e8c82858d688e0ddebd7c4a57ba2aadbb362"} err="failed to get container status \"5c3f12f91c1952c28748db61a256e8c82858d688e0ddebd7c4a57ba2aadbb362\": rpc error: code = NotFound desc = could not find container \"5c3f12f91c1952c28748db61a256e8c82858d688e0ddebd7c4a57ba2aadbb362\": container with ID starting with 5c3f12f91c1952c28748db61a256e8c82858d688e0ddebd7c4a57ba2aadbb362 not found: ID does not exist" Apr 23 16:48:32.308975 
ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:32.308913 2562 scope.go:117] "RemoveContainer" containerID="a7bbf341203eca984b74a3572d396691a3866a3e982ce220eeabbf97faa17616" Apr 23 16:48:32.309297 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:48:32.309221 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7bbf341203eca984b74a3572d396691a3866a3e982ce220eeabbf97faa17616\": container with ID starting with a7bbf341203eca984b74a3572d396691a3866a3e982ce220eeabbf97faa17616 not found: ID does not exist" containerID="a7bbf341203eca984b74a3572d396691a3866a3e982ce220eeabbf97faa17616" Apr 23 16:48:32.309359 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:32.309305 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7bbf341203eca984b74a3572d396691a3866a3e982ce220eeabbf97faa17616"} err="failed to get container status \"a7bbf341203eca984b74a3572d396691a3866a3e982ce220eeabbf97faa17616\": rpc error: code = NotFound desc = could not find container \"a7bbf341203eca984b74a3572d396691a3866a3e982ce220eeabbf97faa17616\": container with ID starting with a7bbf341203eca984b74a3572d396691a3866a3e982ce220eeabbf97faa17616 not found: ID does not exist" Apr 23 16:48:32.309359 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:32.309323 2562 scope.go:117] "RemoveContainer" containerID="8753d2de1139b4a43183eacac75042e874e42794d60477cddd233a7da798ec3c" Apr 23 16:48:32.309586 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:48:32.309562 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8753d2de1139b4a43183eacac75042e874e42794d60477cddd233a7da798ec3c\": container with ID starting with 8753d2de1139b4a43183eacac75042e874e42794d60477cddd233a7da798ec3c not found: ID does not exist" containerID="8753d2de1139b4a43183eacac75042e874e42794d60477cddd233a7da798ec3c" Apr 23 16:48:32.309651 ip-10-0-135-57 
kubenswrapper[2562]: I0423 16:48:32.309593 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8753d2de1139b4a43183eacac75042e874e42794d60477cddd233a7da798ec3c"} err="failed to get container status \"8753d2de1139b4a43183eacac75042e874e42794d60477cddd233a7da798ec3c\": rpc error: code = NotFound desc = could not find container \"8753d2de1139b4a43183eacac75042e874e42794d60477cddd233a7da798ec3c\": container with ID starting with 8753d2de1139b4a43183eacac75042e874e42794d60477cddd233a7da798ec3c not found: ID does not exist" Apr 23 16:48:32.310114 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:32.310093 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-595f4f5b7-w76qr"] Apr 23 16:48:32.742012 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:32.741979 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" path="/var/lib/kubelet/pods/f1144f65-d5c6-49c5-84bc-e4585fbe22c2/volumes" Apr 23 16:48:36.226512 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:36.226471 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 16:48:46.227290 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:46.227195 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 16:48:56.227050 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:48:56.227008 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" 
podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 16:49:06.226928 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:06.226883 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 16:49:16.227081 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:16.227030 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 16:49:26.227129 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:26.227085 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.27:8080: connect: connection refused" Apr 23 16:49:36.227588 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:36.227556 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" Apr 23 16:49:41.588641 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.588607 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w"] Apr 23 16:49:41.589157 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.588891 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="kserve-container" 
containerID="cri-o://6fb546fb69bd5ed998adede50ab80625e41a21b9caaa612e9c9ddc122939c33b" gracePeriod=30 Apr 23 16:49:41.734216 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.734176 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f"] Apr 23 16:49:41.734596 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.734578 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="agent" Apr 23 16:49:41.734688 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.734598 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="agent" Apr 23 16:49:41.734688 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.734619 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="storage-initializer" Apr 23 16:49:41.734688 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.734628 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="storage-initializer" Apr 23 16:49:41.734688 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.734652 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="kserve-container" Apr 23 16:49:41.734688 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.734661 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="kserve-container" Apr 23 16:49:41.734960 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.734732 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="kserve-container" Apr 23 16:49:41.734960 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.734768 2562 memory_manager.go:356] "RemoveStaleState removing 
state" podUID="f1144f65-d5c6-49c5-84bc-e4585fbe22c2" containerName="agent" Apr 23 16:49:41.738969 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.738941 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" Apr 23 16:49:41.748676 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.748649 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f"] Apr 23 16:49:41.797730 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.797698 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/437a5f27-9f15-49f7-bae2-22a3097617e0-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f\" (UID: \"437a5f27-9f15-49f7-bae2-22a3097617e0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" Apr 23 16:49:41.898690 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.898658 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/437a5f27-9f15-49f7-bae2-22a3097617e0-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f\" (UID: \"437a5f27-9f15-49f7-bae2-22a3097617e0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" Apr 23 16:49:41.899134 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:41.899109 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/437a5f27-9f15-49f7-bae2-22a3097617e0-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f\" (UID: \"437a5f27-9f15-49f7-bae2-22a3097617e0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" Apr 23 16:49:42.049914 ip-10-0-135-57 
kubenswrapper[2562]: I0423 16:49:42.049881 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" Apr 23 16:49:42.170036 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:42.169962 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f"] Apr 23 16:49:42.178484 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:49:42.178455 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod437a5f27_9f15_49f7_bae2_22a3097617e0.slice/crio-06fe26a5a883b5ef5dfe552b4de5b8facc05f53604f63292c14f6e80f22ddc81 WatchSource:0}: Error finding container 06fe26a5a883b5ef5dfe552b4de5b8facc05f53604f63292c14f6e80f22ddc81: Status 404 returned error can't find the container with id 06fe26a5a883b5ef5dfe552b4de5b8facc05f53604f63292c14f6e80f22ddc81 Apr 23 16:49:42.491995 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:42.491902 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" event={"ID":"437a5f27-9f15-49f7-bae2-22a3097617e0","Type":"ContainerStarted","Data":"2e94bf22535873195c6f043bc17c4e8df72b81858987bc1a3f871a91425e7a8b"} Apr 23 16:49:42.491995 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:42.491943 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" event={"ID":"437a5f27-9f15-49f7-bae2-22a3097617e0","Type":"ContainerStarted","Data":"06fe26a5a883b5ef5dfe552b4de5b8facc05f53604f63292c14f6e80f22ddc81"} Apr 23 16:49:46.226944 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.226903 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.133.0.27:8080: connect: connection refused" Apr 23 16:49:46.328440 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.328418 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" Apr 23 16:49:46.331727 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.331708 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5e470f8-7279-40a1-bdc8-9cc97ab028e8-kserve-provision-location\") pod \"e5e470f8-7279-40a1-bdc8-9cc97ab028e8\" (UID: \"e5e470f8-7279-40a1-bdc8-9cc97ab028e8\") " Apr 23 16:49:46.332098 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.332072 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5e470f8-7279-40a1-bdc8-9cc97ab028e8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e5e470f8-7279-40a1-bdc8-9cc97ab028e8" (UID: "e5e470f8-7279-40a1-bdc8-9cc97ab028e8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:49:46.433084 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.433049 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e5e470f8-7279-40a1-bdc8-9cc97ab028e8-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:49:46.507078 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.507044 2562 generic.go:358] "Generic (PLEG): container finished" podID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerID="6fb546fb69bd5ed998adede50ab80625e41a21b9caaa612e9c9ddc122939c33b" exitCode=0 Apr 23 16:49:46.507249 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.507112 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" Apr 23 16:49:46.507249 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.507128 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" event={"ID":"e5e470f8-7279-40a1-bdc8-9cc97ab028e8","Type":"ContainerDied","Data":"6fb546fb69bd5ed998adede50ab80625e41a21b9caaa612e9c9ddc122939c33b"} Apr 23 16:49:46.507249 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.507166 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w" event={"ID":"e5e470f8-7279-40a1-bdc8-9cc97ab028e8","Type":"ContainerDied","Data":"4288a2806f5493b1d3e5cb5141aa583e408e919cbaf7bf5ea1a9ab9c2d519cd3"} Apr 23 16:49:46.507249 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.507183 2562 scope.go:117] "RemoveContainer" containerID="6fb546fb69bd5ed998adede50ab80625e41a21b9caaa612e9c9ddc122939c33b" Apr 23 16:49:46.508644 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.508622 2562 generic.go:358] "Generic (PLEG): container finished" podID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerID="2e94bf22535873195c6f043bc17c4e8df72b81858987bc1a3f871a91425e7a8b" exitCode=0 Apr 23 16:49:46.508730 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.508664 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" event={"ID":"437a5f27-9f15-49f7-bae2-22a3097617e0","Type":"ContainerDied","Data":"2e94bf22535873195c6f043bc17c4e8df72b81858987bc1a3f871a91425e7a8b"} Apr 23 16:49:46.515484 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.515440 2562 scope.go:117] "RemoveContainer" containerID="3c00d48f34b1f2e11945a22e9756f786dd3ec9773afe8c57dd0feb2e85a50a41" Apr 23 16:49:46.522980 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.522960 2562 scope.go:117] "RemoveContainer" 
containerID="6fb546fb69bd5ed998adede50ab80625e41a21b9caaa612e9c9ddc122939c33b" Apr 23 16:49:46.523234 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:49:46.523214 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fb546fb69bd5ed998adede50ab80625e41a21b9caaa612e9c9ddc122939c33b\": container with ID starting with 6fb546fb69bd5ed998adede50ab80625e41a21b9caaa612e9c9ddc122939c33b not found: ID does not exist" containerID="6fb546fb69bd5ed998adede50ab80625e41a21b9caaa612e9c9ddc122939c33b" Apr 23 16:49:46.523305 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.523247 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fb546fb69bd5ed998adede50ab80625e41a21b9caaa612e9c9ddc122939c33b"} err="failed to get container status \"6fb546fb69bd5ed998adede50ab80625e41a21b9caaa612e9c9ddc122939c33b\": rpc error: code = NotFound desc = could not find container \"6fb546fb69bd5ed998adede50ab80625e41a21b9caaa612e9c9ddc122939c33b\": container with ID starting with 6fb546fb69bd5ed998adede50ab80625e41a21b9caaa612e9c9ddc122939c33b not found: ID does not exist" Apr 23 16:49:46.523305 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.523274 2562 scope.go:117] "RemoveContainer" containerID="3c00d48f34b1f2e11945a22e9756f786dd3ec9773afe8c57dd0feb2e85a50a41" Apr 23 16:49:46.523554 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:49:46.523530 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c00d48f34b1f2e11945a22e9756f786dd3ec9773afe8c57dd0feb2e85a50a41\": container with ID starting with 3c00d48f34b1f2e11945a22e9756f786dd3ec9773afe8c57dd0feb2e85a50a41 not found: ID does not exist" containerID="3c00d48f34b1f2e11945a22e9756f786dd3ec9773afe8c57dd0feb2e85a50a41" Apr 23 16:49:46.523606 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.523562 2562 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"3c00d48f34b1f2e11945a22e9756f786dd3ec9773afe8c57dd0feb2e85a50a41"} err="failed to get container status \"3c00d48f34b1f2e11945a22e9756f786dd3ec9773afe8c57dd0feb2e85a50a41\": rpc error: code = NotFound desc = could not find container \"3c00d48f34b1f2e11945a22e9756f786dd3ec9773afe8c57dd0feb2e85a50a41\": container with ID starting with 3c00d48f34b1f2e11945a22e9756f786dd3ec9773afe8c57dd0feb2e85a50a41 not found: ID does not exist" Apr 23 16:49:46.539183 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.539151 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w"] Apr 23 16:49:46.547075 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.547048 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-jlq4w"] Apr 23 16:49:46.747840 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:46.747800 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" path="/var/lib/kubelet/pods/e5e470f8-7279-40a1-bdc8-9cc97ab028e8/volumes" Apr 23 16:49:47.513961 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:47.513925 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" event={"ID":"437a5f27-9f15-49f7-bae2-22a3097617e0","Type":"ContainerStarted","Data":"5b697b582e74dcfd400dba1419f1b3113720cd53a32ffdf4baa981002dc72ef7"} Apr 23 16:49:47.514381 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:47.514276 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" Apr 23 16:49:47.515759 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:47.515713 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 16:49:47.531322 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:47.531270 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" podStartSLOduration=6.531255935 podStartE2EDuration="6.531255935s" podCreationTimestamp="2026-04-23 16:49:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:49:47.529139321 +0000 UTC m=+867.381954384" watchObservedRunningTime="2026-04-23 16:49:47.531255935 +0000 UTC m=+867.384070999" Apr 23 16:49:48.518923 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:48.518884 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 16:49:58.519367 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:49:58.519324 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 16:50:08.519665 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:50:08.519620 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 16:50:18.519004 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:50:18.518962 2562 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 16:50:28.518983 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:50:28.518934 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 16:50:38.519070 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:50:38.519020 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 16:50:48.519643 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:50:48.519600 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 16:50:58.519756 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:50:58.519708 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 16:51:02.746021 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:02.745995 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" Apr 23 16:51:12.080430 
ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.080395 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f"] Apr 23 16:51:12.080908 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.080802 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerName="kserve-container" containerID="cri-o://5b697b582e74dcfd400dba1419f1b3113720cd53a32ffdf4baa981002dc72ef7" gracePeriod=30 Apr 23 16:51:12.194644 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.194614 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd"] Apr 23 16:51:12.194956 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.194942 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="storage-initializer" Apr 23 16:51:12.195006 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.194959 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="storage-initializer" Apr 23 16:51:12.195006 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.194971 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="kserve-container" Apr 23 16:51:12.195006 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.194978 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="kserve-container" Apr 23 16:51:12.195099 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.195033 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="e5e470f8-7279-40a1-bdc8-9cc97ab028e8" containerName="kserve-container" Apr 23 16:51:12.198080 ip-10-0-135-57 kubenswrapper[2562]: I0423 
16:51:12.198062 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" Apr 23 16:51:12.205758 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.205715 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd"] Apr 23 16:51:12.356210 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.356109 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc43c2df-4bbd-475e-8471-639451e077bc-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd\" (UID: \"bc43c2df-4bbd-475e-8471-639451e077bc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" Apr 23 16:51:12.457193 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.457147 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc43c2df-4bbd-475e-8471-639451e077bc-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd\" (UID: \"bc43c2df-4bbd-475e-8471-639451e077bc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" Apr 23 16:51:12.457530 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.457509 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc43c2df-4bbd-475e-8471-639451e077bc-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd\" (UID: \"bc43c2df-4bbd-475e-8471-639451e077bc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" Apr 23 16:51:12.508992 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.508945 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" Apr 23 16:51:12.625984 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.625961 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd"] Apr 23 16:51:12.628279 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:51:12.628247 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc43c2df_4bbd_475e_8471_639451e077bc.slice/crio-8324346c31c84d4dc693f3ca1698ce1c516e1af0502fcadc9dc11234f001b054 WatchSource:0}: Error finding container 8324346c31c84d4dc693f3ca1698ce1c516e1af0502fcadc9dc11234f001b054: Status 404 returned error can't find the container with id 8324346c31c84d4dc693f3ca1698ce1c516e1af0502fcadc9dc11234f001b054 Apr 23 16:51:12.738350 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.738303 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 23 16:51:12.767376 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.767339 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" event={"ID":"bc43c2df-4bbd-475e-8471-639451e077bc","Type":"ContainerStarted","Data":"a18124edb298be4a1877678cb428bd030213a5bc574e3d1f32a48c033d3aa4aa"} Apr 23 16:51:12.767376 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:12.767376 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" event={"ID":"bc43c2df-4bbd-475e-8471-639451e077bc","Type":"ContainerStarted","Data":"8324346c31c84d4dc693f3ca1698ce1c516e1af0502fcadc9dc11234f001b054"} Apr 23 16:51:16.779001 
ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:16.778919 2562 generic.go:358] "Generic (PLEG): container finished" podID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerID="5b697b582e74dcfd400dba1419f1b3113720cd53a32ffdf4baa981002dc72ef7" exitCode=0 Apr 23 16:51:16.779374 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:16.778993 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" event={"ID":"437a5f27-9f15-49f7-bae2-22a3097617e0","Type":"ContainerDied","Data":"5b697b582e74dcfd400dba1419f1b3113720cd53a32ffdf4baa981002dc72ef7"} Apr 23 16:51:16.780297 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:16.780274 2562 generic.go:358] "Generic (PLEG): container finished" podID="bc43c2df-4bbd-475e-8471-639451e077bc" containerID="a18124edb298be4a1877678cb428bd030213a5bc574e3d1f32a48c033d3aa4aa" exitCode=0 Apr 23 16:51:16.780412 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:16.780323 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" event={"ID":"bc43c2df-4bbd-475e-8471-639451e077bc","Type":"ContainerDied","Data":"a18124edb298be4a1877678cb428bd030213a5bc574e3d1f32a48c033d3aa4aa"} Apr 23 16:51:16.810757 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:16.810725 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" Apr 23 16:51:16.891950 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:16.891919 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/437a5f27-9f15-49f7-bae2-22a3097617e0-kserve-provision-location\") pod \"437a5f27-9f15-49f7-bae2-22a3097617e0\" (UID: \"437a5f27-9f15-49f7-bae2-22a3097617e0\") " Apr 23 16:51:16.892255 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:16.892229 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/437a5f27-9f15-49f7-bae2-22a3097617e0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "437a5f27-9f15-49f7-bae2-22a3097617e0" (UID: "437a5f27-9f15-49f7-bae2-22a3097617e0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:16.993354 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:16.993324 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/437a5f27-9f15-49f7-bae2-22a3097617e0-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:51:17.787306 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:17.787190 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" event={"ID":"437a5f27-9f15-49f7-bae2-22a3097617e0","Type":"ContainerDied","Data":"06fe26a5a883b5ef5dfe552b4de5b8facc05f53604f63292c14f6e80f22ddc81"} Apr 23 16:51:17.787306 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:17.787246 2562 scope.go:117] "RemoveContainer" containerID="5b697b582e74dcfd400dba1419f1b3113720cd53a32ffdf4baa981002dc72ef7" Apr 23 16:51:17.810020 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:17.787410 2562 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f" Apr 23 16:51:17.810020 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:17.809876 2562 scope.go:117] "RemoveContainer" containerID="2e94bf22535873195c6f043bc17c4e8df72b81858987bc1a3f871a91425e7a8b" Apr 23 16:51:17.818154 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:17.817705 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f"] Apr 23 16:51:17.822233 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:17.822196 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-dbq9f"] Apr 23 16:51:18.745363 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:51:18.744949 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" path="/var/lib/kubelet/pods/437a5f27-9f15-49f7-bae2-22a3097617e0/volumes" Apr 23 16:53:31.244356 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:53:31.244320 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" event={"ID":"bc43c2df-4bbd-475e-8471-639451e077bc","Type":"ContainerStarted","Data":"99736cab003f1f977acd05686979f85a508f62c55888c3793a2056ea7b1706fe"} Apr 23 16:53:31.244799 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:53:31.244547 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" Apr 23 16:53:31.271144 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:53:31.271095 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" podStartSLOduration=5.16342785 podStartE2EDuration="2m19.271080278s" podCreationTimestamp="2026-04-23 16:51:12 +0000 UTC" firstStartedPulling="2026-04-23 16:51:16.781329525 
+0000 UTC m=+956.634144567" lastFinishedPulling="2026-04-23 16:53:30.88898195 +0000 UTC m=+1090.741796995" observedRunningTime="2026-04-23 16:53:31.269977651 +0000 UTC m=+1091.122792718" watchObservedRunningTime="2026-04-23 16:53:31.271080278 +0000 UTC m=+1091.123895341" Apr 23 16:54:02.252986 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:02.252948 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" Apr 23 16:54:12.368260 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.368225 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd"] Apr 23 16:54:12.368831 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.368519 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" podUID="bc43c2df-4bbd-475e-8471-639451e077bc" containerName="kserve-container" containerID="cri-o://99736cab003f1f977acd05686979f85a508f62c55888c3793a2056ea7b1706fe" gracePeriod=30 Apr 23 16:54:12.460563 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.460519 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f"] Apr 23 16:54:12.461066 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.461046 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerName="kserve-container" Apr 23 16:54:12.461123 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.461071 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerName="kserve-container" Apr 23 16:54:12.461123 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.461087 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" 
containerName="storage-initializer" Apr 23 16:54:12.461123 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.461096 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerName="storage-initializer" Apr 23 16:54:12.461260 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.461171 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="437a5f27-9f15-49f7-bae2-22a3097617e0" containerName="kserve-container" Apr 23 16:54:12.464918 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.464902 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" Apr 23 16:54:12.474680 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.474654 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f"] Apr 23 16:54:12.568141 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.568053 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e54675d1-bc93-4a35-b18e-7200792fe7e9-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f\" (UID: \"e54675d1-bc93-4a35-b18e-7200792fe7e9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" Apr 23 16:54:12.668941 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.668903 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e54675d1-bc93-4a35-b18e-7200792fe7e9-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f\" (UID: \"e54675d1-bc93-4a35-b18e-7200792fe7e9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" Apr 23 16:54:12.669307 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.669289 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e54675d1-bc93-4a35-b18e-7200792fe7e9-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f\" (UID: \"e54675d1-bc93-4a35-b18e-7200792fe7e9\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" Apr 23 16:54:12.775688 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.775651 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" Apr 23 16:54:12.900862 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.900832 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f"] Apr 23 16:54:12.903379 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:54:12.903352 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode54675d1_bc93_4a35_b18e_7200792fe7e9.slice/crio-786d5771de009fbd6a0550904c27477e22cea132689d002ee4a3fe315c8bd421 WatchSource:0}: Error finding container 786d5771de009fbd6a0550904c27477e22cea132689d002ee4a3fe315c8bd421: Status 404 returned error can't find the container with id 786d5771de009fbd6a0550904c27477e22cea132689d002ee4a3fe315c8bd421 Apr 23 16:54:12.905279 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:12.905261 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:54:13.326899 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:54:13.326869 2562 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc43c2df_4bbd_475e_8471_639451e077bc.slice/crio-99736cab003f1f977acd05686979f85a508f62c55888c3793a2056ea7b1706fe.scope\": RecentStats: unable to find data in memory 
cache]" Apr 23 16:54:13.369736 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:13.369701 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" event={"ID":"e54675d1-bc93-4a35-b18e-7200792fe7e9","Type":"ContainerStarted","Data":"a0af72ff2902c4ae5295cf990b1f987686d870158cd6919d8b3132f839c47ace"} Apr 23 16:54:13.370211 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:13.369758 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" event={"ID":"e54675d1-bc93-4a35-b18e-7200792fe7e9","Type":"ContainerStarted","Data":"786d5771de009fbd6a0550904c27477e22cea132689d002ee4a3fe315c8bd421"} Apr 23 16:54:13.490948 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:13.490924 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" Apr 23 16:54:13.576520 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:13.576425 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc43c2df-4bbd-475e-8471-639451e077bc-kserve-provision-location\") pod \"bc43c2df-4bbd-475e-8471-639451e077bc\" (UID: \"bc43c2df-4bbd-475e-8471-639451e077bc\") " Apr 23 16:54:13.576819 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:13.576790 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc43c2df-4bbd-475e-8471-639451e077bc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bc43c2df-4bbd-475e-8471-639451e077bc" (UID: "bc43c2df-4bbd-475e-8471-639451e077bc"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:54:13.678014 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:13.677962 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc43c2df-4bbd-475e-8471-639451e077bc-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:54:14.373478 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:14.373446 2562 generic.go:358] "Generic (PLEG): container finished" podID="bc43c2df-4bbd-475e-8471-639451e077bc" containerID="99736cab003f1f977acd05686979f85a508f62c55888c3793a2056ea7b1706fe" exitCode=0 Apr 23 16:54:14.373949 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:14.373513 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" Apr 23 16:54:14.373949 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:14.373543 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" event={"ID":"bc43c2df-4bbd-475e-8471-639451e077bc","Type":"ContainerDied","Data":"99736cab003f1f977acd05686979f85a508f62c55888c3793a2056ea7b1706fe"} Apr 23 16:54:14.373949 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:14.373592 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd" event={"ID":"bc43c2df-4bbd-475e-8471-639451e077bc","Type":"ContainerDied","Data":"8324346c31c84d4dc693f3ca1698ce1c516e1af0502fcadc9dc11234f001b054"} Apr 23 16:54:14.373949 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:14.373621 2562 scope.go:117] "RemoveContainer" containerID="99736cab003f1f977acd05686979f85a508f62c55888c3793a2056ea7b1706fe" Apr 23 16:54:14.382425 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:14.382259 2562 scope.go:117] "RemoveContainer" 
containerID="a18124edb298be4a1877678cb428bd030213a5bc574e3d1f32a48c033d3aa4aa" Apr 23 16:54:14.391384 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:14.391330 2562 scope.go:117] "RemoveContainer" containerID="99736cab003f1f977acd05686979f85a508f62c55888c3793a2056ea7b1706fe" Apr 23 16:54:14.391860 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:54:14.391832 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99736cab003f1f977acd05686979f85a508f62c55888c3793a2056ea7b1706fe\": container with ID starting with 99736cab003f1f977acd05686979f85a508f62c55888c3793a2056ea7b1706fe not found: ID does not exist" containerID="99736cab003f1f977acd05686979f85a508f62c55888c3793a2056ea7b1706fe" Apr 23 16:54:14.391979 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:14.391862 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99736cab003f1f977acd05686979f85a508f62c55888c3793a2056ea7b1706fe"} err="failed to get container status \"99736cab003f1f977acd05686979f85a508f62c55888c3793a2056ea7b1706fe\": rpc error: code = NotFound desc = could not find container \"99736cab003f1f977acd05686979f85a508f62c55888c3793a2056ea7b1706fe\": container with ID starting with 99736cab003f1f977acd05686979f85a508f62c55888c3793a2056ea7b1706fe not found: ID does not exist" Apr 23 16:54:14.391979 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:14.391881 2562 scope.go:117] "RemoveContainer" containerID="a18124edb298be4a1877678cb428bd030213a5bc574e3d1f32a48c033d3aa4aa" Apr 23 16:54:14.392137 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:54:14.392115 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a18124edb298be4a1877678cb428bd030213a5bc574e3d1f32a48c033d3aa4aa\": container with ID starting with a18124edb298be4a1877678cb428bd030213a5bc574e3d1f32a48c033d3aa4aa not found: ID does not exist" 
containerID="a18124edb298be4a1877678cb428bd030213a5bc574e3d1f32a48c033d3aa4aa" Apr 23 16:54:14.392194 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:14.392149 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a18124edb298be4a1877678cb428bd030213a5bc574e3d1f32a48c033d3aa4aa"} err="failed to get container status \"a18124edb298be4a1877678cb428bd030213a5bc574e3d1f32a48c033d3aa4aa\": rpc error: code = NotFound desc = could not find container \"a18124edb298be4a1877678cb428bd030213a5bc574e3d1f32a48c033d3aa4aa\": container with ID starting with a18124edb298be4a1877678cb428bd030213a5bc574e3d1f32a48c033d3aa4aa not found: ID does not exist" Apr 23 16:54:14.394583 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:14.394557 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd"] Apr 23 16:54:14.399232 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:14.399210 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mznpd"] Apr 23 16:54:14.741852 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:14.741823 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc43c2df-4bbd-475e-8471-639451e077bc" path="/var/lib/kubelet/pods/bc43c2df-4bbd-475e-8471-639451e077bc/volumes" Apr 23 16:54:17.385039 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:17.385007 2562 generic.go:358] "Generic (PLEG): container finished" podID="e54675d1-bc93-4a35-b18e-7200792fe7e9" containerID="a0af72ff2902c4ae5295cf990b1f987686d870158cd6919d8b3132f839c47ace" exitCode=0 Apr 23 16:54:17.385521 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:17.385059 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" 
event={"ID":"e54675d1-bc93-4a35-b18e-7200792fe7e9","Type":"ContainerDied","Data":"a0af72ff2902c4ae5295cf990b1f987686d870158cd6919d8b3132f839c47ace"} Apr 23 16:54:18.390098 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:18.390056 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" event={"ID":"e54675d1-bc93-4a35-b18e-7200792fe7e9","Type":"ContainerStarted","Data":"349ad7443c1432ae8d289b425ec1025330ae42691f7c69bc02b3afbe1b891cd1"} Apr 23 16:54:18.390685 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:18.390327 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" Apr 23 16:54:18.391701 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:18.391675 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" podUID="e54675d1-bc93-4a35-b18e-7200792fe7e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 16:54:18.406019 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:18.405959 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" podStartSLOduration=6.405945443 podStartE2EDuration="6.405945443s" podCreationTimestamp="2026-04-23 16:54:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:54:18.404937912 +0000 UTC m=+1138.257752981" watchObservedRunningTime="2026-04-23 16:54:18.405945443 +0000 UTC m=+1138.258760506" Apr 23 16:54:19.393721 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:19.393680 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" 
podUID="e54675d1-bc93-4a35-b18e-7200792fe7e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 23 16:54:29.395370 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:29.395338 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" Apr 23 16:54:32.526277 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:32.526246 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f"] Apr 23 16:54:32.526718 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:32.526476 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" podUID="e54675d1-bc93-4a35-b18e-7200792fe7e9" containerName="kserve-container" containerID="cri-o://349ad7443c1432ae8d289b425ec1025330ae42691f7c69bc02b3afbe1b891cd1" gracePeriod=30 Apr 23 16:54:32.579514 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:32.579478 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6"] Apr 23 16:54:32.579813 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:32.579800 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc43c2df-4bbd-475e-8471-639451e077bc" containerName="storage-initializer" Apr 23 16:54:32.579866 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:32.579814 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc43c2df-4bbd-475e-8471-639451e077bc" containerName="storage-initializer" Apr 23 16:54:32.579866 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:32.579832 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc43c2df-4bbd-475e-8471-639451e077bc" containerName="kserve-container" Apr 23 16:54:32.579866 ip-10-0-135-57 kubenswrapper[2562]: I0423 
16:54:32.579838 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc43c2df-4bbd-475e-8471-639451e077bc" containerName="kserve-container" Apr 23 16:54:32.579961 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:32.579887 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc43c2df-4bbd-475e-8471-639451e077bc" containerName="kserve-container" Apr 23 16:54:32.584112 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:32.584094 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6" Apr 23 16:54:32.592080 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:32.592057 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6"] Apr 23 16:54:32.744019 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:32.743979 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70547a3d-ef2b-4bad-a44e-620f6c893585-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6\" (UID: \"70547a3d-ef2b-4bad-a44e-620f6c893585\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6" Apr 23 16:54:32.845447 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:32.845358 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70547a3d-ef2b-4bad-a44e-620f6c893585-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6\" (UID: \"70547a3d-ef2b-4bad-a44e-620f6c893585\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6" Apr 23 16:54:32.845769 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:32.845724 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/70547a3d-ef2b-4bad-a44e-620f6c893585-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6\" (UID: \"70547a3d-ef2b-4bad-a44e-620f6c893585\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6" Apr 23 16:54:32.894634 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:32.894590 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6" Apr 23 16:54:33.020667 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.020631 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6"] Apr 23 16:54:33.023856 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:54:33.023823 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70547a3d_ef2b_4bad_a44e_620f6c893585.slice/crio-d5eb08f180473ea1ffa4d2d44da22d2817edbebae16edf49eaefc5b72d84845c WatchSource:0}: Error finding container d5eb08f180473ea1ffa4d2d44da22d2817edbebae16edf49eaefc5b72d84845c: Status 404 returned error can't find the container with id d5eb08f180473ea1ffa4d2d44da22d2817edbebae16edf49eaefc5b72d84845c Apr 23 16:54:33.266469 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.266444 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" Apr 23 16:54:33.439250 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.439155 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6" event={"ID":"70547a3d-ef2b-4bad-a44e-620f6c893585","Type":"ContainerStarted","Data":"a57b9b9af37e885cf74b47613b722f99a80cbd7473f6fc5d98ab6fa17a5d8dc3"} Apr 23 16:54:33.439250 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.439195 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6" event={"ID":"70547a3d-ef2b-4bad-a44e-620f6c893585","Type":"ContainerStarted","Data":"d5eb08f180473ea1ffa4d2d44da22d2817edbebae16edf49eaefc5b72d84845c"} Apr 23 16:54:33.440469 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.440445 2562 generic.go:358] "Generic (PLEG): container finished" podID="e54675d1-bc93-4a35-b18e-7200792fe7e9" containerID="349ad7443c1432ae8d289b425ec1025330ae42691f7c69bc02b3afbe1b891cd1" exitCode=0 Apr 23 16:54:33.440579 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.440503 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" event={"ID":"e54675d1-bc93-4a35-b18e-7200792fe7e9","Type":"ContainerDied","Data":"349ad7443c1432ae8d289b425ec1025330ae42691f7c69bc02b3afbe1b891cd1"} Apr 23 16:54:33.440579 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.440525 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" Apr 23 16:54:33.440579 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.440541 2562 scope.go:117] "RemoveContainer" containerID="349ad7443c1432ae8d289b425ec1025330ae42691f7c69bc02b3afbe1b891cd1" Apr 23 16:54:33.440698 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.440531 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f" event={"ID":"e54675d1-bc93-4a35-b18e-7200792fe7e9","Type":"ContainerDied","Data":"786d5771de009fbd6a0550904c27477e22cea132689d002ee4a3fe315c8bd421"} Apr 23 16:54:33.448630 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.448610 2562 scope.go:117] "RemoveContainer" containerID="a0af72ff2902c4ae5295cf990b1f987686d870158cd6919d8b3132f839c47ace" Apr 23 16:54:33.450648 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.450631 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e54675d1-bc93-4a35-b18e-7200792fe7e9-kserve-provision-location\") pod \"e54675d1-bc93-4a35-b18e-7200792fe7e9\" (UID: \"e54675d1-bc93-4a35-b18e-7200792fe7e9\") " Apr 23 16:54:33.450955 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.450934 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e54675d1-bc93-4a35-b18e-7200792fe7e9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e54675d1-bc93-4a35-b18e-7200792fe7e9" (UID: "e54675d1-bc93-4a35-b18e-7200792fe7e9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:54:33.455847 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.455665 2562 scope.go:117] "RemoveContainer" containerID="349ad7443c1432ae8d289b425ec1025330ae42691f7c69bc02b3afbe1b891cd1"
Apr 23 16:54:33.456969 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:54:33.456939 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"349ad7443c1432ae8d289b425ec1025330ae42691f7c69bc02b3afbe1b891cd1\": container with ID starting with 349ad7443c1432ae8d289b425ec1025330ae42691f7c69bc02b3afbe1b891cd1 not found: ID does not exist" containerID="349ad7443c1432ae8d289b425ec1025330ae42691f7c69bc02b3afbe1b891cd1"
Apr 23 16:54:33.457080 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.456982 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"349ad7443c1432ae8d289b425ec1025330ae42691f7c69bc02b3afbe1b891cd1"} err="failed to get container status \"349ad7443c1432ae8d289b425ec1025330ae42691f7c69bc02b3afbe1b891cd1\": rpc error: code = NotFound desc = could not find container \"349ad7443c1432ae8d289b425ec1025330ae42691f7c69bc02b3afbe1b891cd1\": container with ID starting with 349ad7443c1432ae8d289b425ec1025330ae42691f7c69bc02b3afbe1b891cd1 not found: ID does not exist"
Apr 23 16:54:33.457080 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.457008 2562 scope.go:117] "RemoveContainer" containerID="a0af72ff2902c4ae5295cf990b1f987686d870158cd6919d8b3132f839c47ace"
Apr 23 16:54:33.457326 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:54:33.457293 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0af72ff2902c4ae5295cf990b1f987686d870158cd6919d8b3132f839c47ace\": container with ID starting with a0af72ff2902c4ae5295cf990b1f987686d870158cd6919d8b3132f839c47ace not found: ID does not exist" containerID="a0af72ff2902c4ae5295cf990b1f987686d870158cd6919d8b3132f839c47ace"
Apr 23 16:54:33.457418 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.457331 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0af72ff2902c4ae5295cf990b1f987686d870158cd6919d8b3132f839c47ace"} err="failed to get container status \"a0af72ff2902c4ae5295cf990b1f987686d870158cd6919d8b3132f839c47ace\": rpc error: code = NotFound desc = could not find container \"a0af72ff2902c4ae5295cf990b1f987686d870158cd6919d8b3132f839c47ace\": container with ID starting with a0af72ff2902c4ae5295cf990b1f987686d870158cd6919d8b3132f839c47ace not found: ID does not exist"
Apr 23 16:54:33.551294 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.551259 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e54675d1-bc93-4a35-b18e-7200792fe7e9-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:54:33.762499 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.762472 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f"]
Apr 23 16:54:33.766672 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:33.766647 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-wgc4f"]
Apr 23 16:54:34.742898 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:34.742866 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e54675d1-bc93-4a35-b18e-7200792fe7e9" path="/var/lib/kubelet/pods/e54675d1-bc93-4a35-b18e-7200792fe7e9/volumes"
Apr 23 16:54:37.454705 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:37.454672 2562 generic.go:358] "Generic (PLEG): container finished" podID="70547a3d-ef2b-4bad-a44e-620f6c893585" containerID="a57b9b9af37e885cf74b47613b722f99a80cbd7473f6fc5d98ab6fa17a5d8dc3" exitCode=0
Apr 23 16:54:37.455091 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:37.454758 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6" event={"ID":"70547a3d-ef2b-4bad-a44e-620f6c893585","Type":"ContainerDied","Data":"a57b9b9af37e885cf74b47613b722f99a80cbd7473f6fc5d98ab6fa17a5d8dc3"}
Apr 23 16:54:38.459144 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:38.459109 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6" event={"ID":"70547a3d-ef2b-4bad-a44e-620f6c893585","Type":"ContainerStarted","Data":"8f3e2b2cc55a9c2ad1f97367b247d0579dd009f23d9904e2573f322e27023d43"}
Apr 23 16:54:38.459528 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:38.459356 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6"
Apr 23 16:54:38.476650 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:54:38.476581 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6" podStartSLOduration=6.476565267 podStartE2EDuration="6.476565267s" podCreationTimestamp="2026-04-23 16:54:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:54:38.475435694 +0000 UTC m=+1158.328250783" watchObservedRunningTime="2026-04-23 16:54:38.476565267 +0000 UTC m=+1158.329380330"
Apr 23 16:55:09.466819 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:09.466731 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6"
Apr 23 16:55:12.720887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:12.720851 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6"]
Apr 23 16:55:12.721248 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:12.721103 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6" podUID="70547a3d-ef2b-4bad-a44e-620f6c893585" containerName="kserve-container" containerID="cri-o://8f3e2b2cc55a9c2ad1f97367b247d0579dd009f23d9904e2573f322e27023d43" gracePeriod=30
Apr 23 16:55:12.770062 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:12.770025 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"]
Apr 23 16:55:12.770429 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:12.770411 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e54675d1-bc93-4a35-b18e-7200792fe7e9" containerName="storage-initializer"
Apr 23 16:55:12.770509 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:12.770432 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54675d1-bc93-4a35-b18e-7200792fe7e9" containerName="storage-initializer"
Apr 23 16:55:12.770565 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:12.770510 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e54675d1-bc93-4a35-b18e-7200792fe7e9" containerName="kserve-container"
Apr 23 16:55:12.770565 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:12.770521 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e54675d1-bc93-4a35-b18e-7200792fe7e9" containerName="kserve-container"
Apr 23 16:55:12.770661 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:12.770621 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="e54675d1-bc93-4a35-b18e-7200792fe7e9" containerName="kserve-container"
Apr 23 16:55:12.773653 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:12.773632 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"
Apr 23 16:55:12.785317 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:12.785290 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"]
Apr 23 16:55:12.835887 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:12.835846 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62381469-ebaf-455d-9c10-1c121a0682aa-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5c9576c899-6cntj\" (UID: \"62381469-ebaf-455d-9c10-1c121a0682aa\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"
Apr 23 16:55:12.937263 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:12.937224 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62381469-ebaf-455d-9c10-1c121a0682aa-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5c9576c899-6cntj\" (UID: \"62381469-ebaf-455d-9c10-1c121a0682aa\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"
Apr 23 16:55:12.937625 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:12.937605 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62381469-ebaf-455d-9c10-1c121a0682aa-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5c9576c899-6cntj\" (UID: \"62381469-ebaf-455d-9c10-1c121a0682aa\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"
Apr 23 16:55:13.083327 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:13.083237 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"
Apr 23 16:55:13.209205 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:13.209117 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"]
Apr 23 16:55:13.211707 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:55:13.211678 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62381469_ebaf_455d_9c10_1c121a0682aa.slice/crio-b4ed4d32681a4e56ad0b9acd171eef9d1af2b742585febbe45e88f02558e03db WatchSource:0}: Error finding container b4ed4d32681a4e56ad0b9acd171eef9d1af2b742585febbe45e88f02558e03db: Status 404 returned error can't find the container with id b4ed4d32681a4e56ad0b9acd171eef9d1af2b742585febbe45e88f02558e03db
Apr 23 16:55:13.579363 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:13.579328 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj" event={"ID":"62381469-ebaf-455d-9c10-1c121a0682aa","Type":"ContainerStarted","Data":"3b45ac9313f6c8989d8d447815663647fab10ea03425e48688ff148e829ef14f"}
Apr 23 16:55:13.579539 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:13.579375 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj" event={"ID":"62381469-ebaf-455d-9c10-1c121a0682aa","Type":"ContainerStarted","Data":"b4ed4d32681a4e56ad0b9acd171eef9d1af2b742585febbe45e88f02558e03db"}
Apr 23 16:55:14.051338 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.051312 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6"
Apr 23 16:55:14.148272 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.148232 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70547a3d-ef2b-4bad-a44e-620f6c893585-kserve-provision-location\") pod \"70547a3d-ef2b-4bad-a44e-620f6c893585\" (UID: \"70547a3d-ef2b-4bad-a44e-620f6c893585\") "
Apr 23 16:55:14.148619 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.148598 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70547a3d-ef2b-4bad-a44e-620f6c893585-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "70547a3d-ef2b-4bad-a44e-620f6c893585" (UID: "70547a3d-ef2b-4bad-a44e-620f6c893585"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:55:14.249667 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.249634 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/70547a3d-ef2b-4bad-a44e-620f6c893585-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:55:14.583579 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.583541 2562 generic.go:358] "Generic (PLEG): container finished" podID="70547a3d-ef2b-4bad-a44e-620f6c893585" containerID="8f3e2b2cc55a9c2ad1f97367b247d0579dd009f23d9904e2573f322e27023d43" exitCode=0
Apr 23 16:55:14.583793 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.583612 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6"
Apr 23 16:55:14.583793 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.583639 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6" event={"ID":"70547a3d-ef2b-4bad-a44e-620f6c893585","Type":"ContainerDied","Data":"8f3e2b2cc55a9c2ad1f97367b247d0579dd009f23d9904e2573f322e27023d43"}
Apr 23 16:55:14.583793 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.583680 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6" event={"ID":"70547a3d-ef2b-4bad-a44e-620f6c893585","Type":"ContainerDied","Data":"d5eb08f180473ea1ffa4d2d44da22d2817edbebae16edf49eaefc5b72d84845c"}
Apr 23 16:55:14.583793 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.583696 2562 scope.go:117] "RemoveContainer" containerID="8f3e2b2cc55a9c2ad1f97367b247d0579dd009f23d9904e2573f322e27023d43"
Apr 23 16:55:14.592177 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.592160 2562 scope.go:117] "RemoveContainer" containerID="a57b9b9af37e885cf74b47613b722f99a80cbd7473f6fc5d98ab6fa17a5d8dc3"
Apr 23 16:55:14.599260 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.599239 2562 scope.go:117] "RemoveContainer" containerID="8f3e2b2cc55a9c2ad1f97367b247d0579dd009f23d9904e2573f322e27023d43"
Apr 23 16:55:14.599477 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:55:14.599457 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f3e2b2cc55a9c2ad1f97367b247d0579dd009f23d9904e2573f322e27023d43\": container with ID starting with 8f3e2b2cc55a9c2ad1f97367b247d0579dd009f23d9904e2573f322e27023d43 not found: ID does not exist" containerID="8f3e2b2cc55a9c2ad1f97367b247d0579dd009f23d9904e2573f322e27023d43"
Apr 23 16:55:14.599546 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.599491 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f3e2b2cc55a9c2ad1f97367b247d0579dd009f23d9904e2573f322e27023d43"} err="failed to get container status \"8f3e2b2cc55a9c2ad1f97367b247d0579dd009f23d9904e2573f322e27023d43\": rpc error: code = NotFound desc = could not find container \"8f3e2b2cc55a9c2ad1f97367b247d0579dd009f23d9904e2573f322e27023d43\": container with ID starting with 8f3e2b2cc55a9c2ad1f97367b247d0579dd009f23d9904e2573f322e27023d43 not found: ID does not exist"
Apr 23 16:55:14.599546 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.599517 2562 scope.go:117] "RemoveContainer" containerID="a57b9b9af37e885cf74b47613b722f99a80cbd7473f6fc5d98ab6fa17a5d8dc3"
Apr 23 16:55:14.599778 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:55:14.599756 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a57b9b9af37e885cf74b47613b722f99a80cbd7473f6fc5d98ab6fa17a5d8dc3\": container with ID starting with a57b9b9af37e885cf74b47613b722f99a80cbd7473f6fc5d98ab6fa17a5d8dc3 not found: ID does not exist" containerID="a57b9b9af37e885cf74b47613b722f99a80cbd7473f6fc5d98ab6fa17a5d8dc3"
Apr 23 16:55:14.599827 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.599787 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a57b9b9af37e885cf74b47613b722f99a80cbd7473f6fc5d98ab6fa17a5d8dc3"} err="failed to get container status \"a57b9b9af37e885cf74b47613b722f99a80cbd7473f6fc5d98ab6fa17a5d8dc3\": rpc error: code = NotFound desc = could not find container \"a57b9b9af37e885cf74b47613b722f99a80cbd7473f6fc5d98ab6fa17a5d8dc3\": container with ID starting with a57b9b9af37e885cf74b47613b722f99a80cbd7473f6fc5d98ab6fa17a5d8dc3 not found: ID does not exist"
Apr 23 16:55:14.604870 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.604849 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6"]
Apr 23 16:55:14.608365 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.608347 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-2nrv6"]
Apr 23 16:55:14.742593 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:14.742564 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70547a3d-ef2b-4bad-a44e-620f6c893585" path="/var/lib/kubelet/pods/70547a3d-ef2b-4bad-a44e-620f6c893585/volumes"
Apr 23 16:55:17.594578 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:17.594542 2562 generic.go:358] "Generic (PLEG): container finished" podID="62381469-ebaf-455d-9c10-1c121a0682aa" containerID="3b45ac9313f6c8989d8d447815663647fab10ea03425e48688ff148e829ef14f" exitCode=0
Apr 23 16:55:17.594972 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:17.594615 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj" event={"ID":"62381469-ebaf-455d-9c10-1c121a0682aa","Type":"ContainerDied","Data":"3b45ac9313f6c8989d8d447815663647fab10ea03425e48688ff148e829ef14f"}
Apr 23 16:55:18.599806 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:18.599771 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj" event={"ID":"62381469-ebaf-455d-9c10-1c121a0682aa","Type":"ContainerStarted","Data":"5ab43cd195da21aa31d4a9dbe701b8796d2007a612b393e4c140bd8510eb5d5e"}
Apr 23 16:55:20.609252 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:20.609220 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj" event={"ID":"62381469-ebaf-455d-9c10-1c121a0682aa","Type":"ContainerStarted","Data":"f0c607b4904222f9bd7a320c579a08752c0f3b6f356d1b83117bb547f8c369f8"}
Apr 23 16:55:20.609652 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:20.609352 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"
Apr 23 16:55:20.609652 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:20.609374 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"
Apr 23 16:55:20.627506 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:20.627453 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj" podStartSLOduration=5.804726443 podStartE2EDuration="8.627437822s" podCreationTimestamp="2026-04-23 16:55:12 +0000 UTC" firstStartedPulling="2026-04-23 16:55:17.67273119 +0000 UTC m=+1197.525546233" lastFinishedPulling="2026-04-23 16:55:20.49544257 +0000 UTC m=+1200.348257612" observedRunningTime="2026-04-23 16:55:20.62652331 +0000 UTC m=+1200.479338374" watchObservedRunningTime="2026-04-23 16:55:20.627437822 +0000 UTC m=+1200.480252885"
Apr 23 16:55:51.616266 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:55:51.616228 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"
Apr 23 16:56:21.618153 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:21.618123 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"
Apr 23 16:56:22.900819 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:22.900790 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"]
Apr 23 16:56:22.901318 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:22.901057 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj" podUID="62381469-ebaf-455d-9c10-1c121a0682aa" containerName="kserve-container" containerID="cri-o://5ab43cd195da21aa31d4a9dbe701b8796d2007a612b393e4c140bd8510eb5d5e" gracePeriod=30
Apr 23 16:56:22.901318 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:22.901081 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj" podUID="62381469-ebaf-455d-9c10-1c121a0682aa" containerName="kserve-agent" containerID="cri-o://f0c607b4904222f9bd7a320c579a08752c0f3b6f356d1b83117bb547f8c369f8" gracePeriod=30
Apr 23 16:56:22.962380 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:22.962340 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k"]
Apr 23 16:56:22.962759 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:22.962721 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70547a3d-ef2b-4bad-a44e-620f6c893585" containerName="kserve-container"
Apr 23 16:56:22.962759 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:22.962756 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="70547a3d-ef2b-4bad-a44e-620f6c893585" containerName="kserve-container"
Apr 23 16:56:22.962899 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:22.962799 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70547a3d-ef2b-4bad-a44e-620f6c893585" containerName="storage-initializer"
Apr 23 16:56:22.962899 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:22.962808 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="70547a3d-ef2b-4bad-a44e-620f6c893585" containerName="storage-initializer"
Apr 23 16:56:22.962899 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:22.962892 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="70547a3d-ef2b-4bad-a44e-620f6c893585" containerName="kserve-container"
Apr 23 16:56:22.965956 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:22.965936 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k"
Apr 23 16:56:22.976524 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:22.976500 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k"]
Apr 23 16:56:23.010928 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:23.010894 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d29c8d4-57cc-47e1-9981-e00dced1c3ee-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-2cs4k\" (UID: \"6d29c8d4-57cc-47e1-9981-e00dced1c3ee\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k"
Apr 23 16:56:23.112111 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:23.112073 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d29c8d4-57cc-47e1-9981-e00dced1c3ee-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-2cs4k\" (UID: \"6d29c8d4-57cc-47e1-9981-e00dced1c3ee\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k"
Apr 23 16:56:23.112453 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:23.112433 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d29c8d4-57cc-47e1-9981-e00dced1c3ee-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-2cs4k\" (UID: \"6d29c8d4-57cc-47e1-9981-e00dced1c3ee\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k"
Apr 23 16:56:23.276921 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:23.276833 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k"
Apr 23 16:56:23.396051 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:23.396027 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k"]
Apr 23 16:56:23.398226 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:56:23.398195 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d29c8d4_57cc_47e1_9981_e00dced1c3ee.slice/crio-a2c45c9188acc015d82452188adbdbb4ff2bb877b2a8f5fa8023102ba091dd73 WatchSource:0}: Error finding container a2c45c9188acc015d82452188adbdbb4ff2bb877b2a8f5fa8023102ba091dd73: Status 404 returned error can't find the container with id a2c45c9188acc015d82452188adbdbb4ff2bb877b2a8f5fa8023102ba091dd73
Apr 23 16:56:23.792330 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:23.792298 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" event={"ID":"6d29c8d4-57cc-47e1-9981-e00dced1c3ee","Type":"ContainerStarted","Data":"b34b6d45a8877489f04ad7ceb1010eb438ec0a6f7811a5ba2cdfc7b9911306ce"}
Apr 23 16:56:23.792495 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:23.792336 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" event={"ID":"6d29c8d4-57cc-47e1-9981-e00dced1c3ee","Type":"ContainerStarted","Data":"a2c45c9188acc015d82452188adbdbb4ff2bb877b2a8f5fa8023102ba091dd73"}
Apr 23 16:56:25.800449 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:25.800420 2562 generic.go:358] "Generic (PLEG): container finished" podID="62381469-ebaf-455d-9c10-1c121a0682aa" containerID="5ab43cd195da21aa31d4a9dbe701b8796d2007a612b393e4c140bd8510eb5d5e" exitCode=0
Apr 23 16:56:25.800833 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:25.800492 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj" event={"ID":"62381469-ebaf-455d-9c10-1c121a0682aa","Type":"ContainerDied","Data":"5ab43cd195da21aa31d4a9dbe701b8796d2007a612b393e4c140bd8510eb5d5e"}
Apr 23 16:56:28.810499 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:28.810458 2562 generic.go:358] "Generic (PLEG): container finished" podID="6d29c8d4-57cc-47e1-9981-e00dced1c3ee" containerID="b34b6d45a8877489f04ad7ceb1010eb438ec0a6f7811a5ba2cdfc7b9911306ce" exitCode=0
Apr 23 16:56:28.810906 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:28.810532 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" event={"ID":"6d29c8d4-57cc-47e1-9981-e00dced1c3ee","Type":"ContainerDied","Data":"b34b6d45a8877489f04ad7ceb1010eb438ec0a6f7811a5ba2cdfc7b9911306ce"}
Apr 23 16:56:31.613792 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:31.613728 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj" podUID="62381469-ebaf-455d-9c10-1c121a0682aa" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.32:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 16:56:40.853983 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:40.853890 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" event={"ID":"6d29c8d4-57cc-47e1-9981-e00dced1c3ee","Type":"ContainerStarted","Data":"a85c98602cb34746f17dd0122d16ad9c117d42081929b65f36eaae309f0b79b9"}
Apr 23 16:56:40.854454 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:40.854229 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k"
Apr 23 16:56:40.855593 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:40.855565 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" podUID="6d29c8d4-57cc-47e1-9981-e00dced1c3ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 23 16:56:40.880270 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:40.880218 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" podStartSLOduration=7.248400647 podStartE2EDuration="18.880204702s" podCreationTimestamp="2026-04-23 16:56:22 +0000 UTC" firstStartedPulling="2026-04-23 16:56:28.811677271 +0000 UTC m=+1268.664492313" lastFinishedPulling="2026-04-23 16:56:40.443481322 +0000 UTC m=+1280.296296368" observedRunningTime="2026-04-23 16:56:40.878520512 +0000 UTC m=+1280.731335576" watchObservedRunningTime="2026-04-23 16:56:40.880204702 +0000 UTC m=+1280.733019767"
Apr 23 16:56:41.613141 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:41.613096 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj" podUID="62381469-ebaf-455d-9c10-1c121a0682aa" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.32:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 16:56:41.859881 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:41.859841 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" podUID="6d29c8d4-57cc-47e1-9981-e00dced1c3ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 23 16:56:51.613720 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:51.613680 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj" podUID="62381469-ebaf-455d-9c10-1c121a0682aa" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.32:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 23 16:56:51.614112 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:51.613855 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"
Apr 23 16:56:51.860873 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:51.860832 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" podUID="6d29c8d4-57cc-47e1-9981-e00dced1c3ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 23 16:56:53.050965 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.050943 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"
Apr 23 16:56:53.187585 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.187488 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62381469-ebaf-455d-9c10-1c121a0682aa-kserve-provision-location\") pod \"62381469-ebaf-455d-9c10-1c121a0682aa\" (UID: \"62381469-ebaf-455d-9c10-1c121a0682aa\") "
Apr 23 16:56:53.187916 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.187884 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62381469-ebaf-455d-9c10-1c121a0682aa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "62381469-ebaf-455d-9c10-1c121a0682aa" (UID: "62381469-ebaf-455d-9c10-1c121a0682aa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:56:53.288931 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.288894 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/62381469-ebaf-455d-9c10-1c121a0682aa-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:56:53.895985 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.895948 2562 generic.go:358] "Generic (PLEG): container finished" podID="62381469-ebaf-455d-9c10-1c121a0682aa" containerID="f0c607b4904222f9bd7a320c579a08752c0f3b6f356d1b83117bb547f8c369f8" exitCode=0
Apr 23 16:56:53.896270 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.896023 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj" event={"ID":"62381469-ebaf-455d-9c10-1c121a0682aa","Type":"ContainerDied","Data":"f0c607b4904222f9bd7a320c579a08752c0f3b6f356d1b83117bb547f8c369f8"}
Apr 23 16:56:53.896270 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.896058 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj" event={"ID":"62381469-ebaf-455d-9c10-1c121a0682aa","Type":"ContainerDied","Data":"b4ed4d32681a4e56ad0b9acd171eef9d1af2b742585febbe45e88f02558e03db"}
Apr 23 16:56:53.896270 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.896074 2562 scope.go:117] "RemoveContainer" containerID="f0c607b4904222f9bd7a320c579a08752c0f3b6f356d1b83117bb547f8c369f8"
Apr 23 16:56:53.896270 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.896031 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"
Apr 23 16:56:53.904165 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.904147 2562 scope.go:117] "RemoveContainer" containerID="5ab43cd195da21aa31d4a9dbe701b8796d2007a612b393e4c140bd8510eb5d5e"
Apr 23 16:56:53.911100 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.910907 2562 scope.go:117] "RemoveContainer" containerID="3b45ac9313f6c8989d8d447815663647fab10ea03425e48688ff148e829ef14f"
Apr 23 16:56:53.917870 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.917859 2562 scope.go:117] "RemoveContainer" containerID="f0c607b4904222f9bd7a320c579a08752c0f3b6f356d1b83117bb547f8c369f8"
Apr 23 16:56:53.918128 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:56:53.918108 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0c607b4904222f9bd7a320c579a08752c0f3b6f356d1b83117bb547f8c369f8\": container with ID starting with f0c607b4904222f9bd7a320c579a08752c0f3b6f356d1b83117bb547f8c369f8 not found: ID does not exist" containerID="f0c607b4904222f9bd7a320c579a08752c0f3b6f356d1b83117bb547f8c369f8"
Apr 23 16:56:53.918194 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.918137 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0c607b4904222f9bd7a320c579a08752c0f3b6f356d1b83117bb547f8c369f8"} err="failed to get container status \"f0c607b4904222f9bd7a320c579a08752c0f3b6f356d1b83117bb547f8c369f8\": rpc error: code = NotFound desc = could not find container \"f0c607b4904222f9bd7a320c579a08752c0f3b6f356d1b83117bb547f8c369f8\": container with ID starting with f0c607b4904222f9bd7a320c579a08752c0f3b6f356d1b83117bb547f8c369f8 not found: ID does not exist"
Apr 23 16:56:53.918194 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.918154 2562 scope.go:117] "RemoveContainer" containerID="5ab43cd195da21aa31d4a9dbe701b8796d2007a612b393e4c140bd8510eb5d5e"
Apr 23 16:56:53.918388 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:56:53.918373 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ab43cd195da21aa31d4a9dbe701b8796d2007a612b393e4c140bd8510eb5d5e\": container with ID starting with 5ab43cd195da21aa31d4a9dbe701b8796d2007a612b393e4c140bd8510eb5d5e not found: ID does not exist" containerID="5ab43cd195da21aa31d4a9dbe701b8796d2007a612b393e4c140bd8510eb5d5e"
Apr 23 16:56:53.918453 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.918390 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ab43cd195da21aa31d4a9dbe701b8796d2007a612b393e4c140bd8510eb5d5e"} err="failed to get container status \"5ab43cd195da21aa31d4a9dbe701b8796d2007a612b393e4c140bd8510eb5d5e\": rpc error: code = NotFound desc = could not find container \"5ab43cd195da21aa31d4a9dbe701b8796d2007a612b393e4c140bd8510eb5d5e\": container with ID starting with 5ab43cd195da21aa31d4a9dbe701b8796d2007a612b393e4c140bd8510eb5d5e not found: ID does not exist"
Apr 23 16:56:53.918453 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.918403 2562 scope.go:117] "RemoveContainer" containerID="3b45ac9313f6c8989d8d447815663647fab10ea03425e48688ff148e829ef14f"
Apr 23 16:56:53.918453 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.918425 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"]
Apr 23 16:56:53.918605 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:56:53.918589 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b45ac9313f6c8989d8d447815663647fab10ea03425e48688ff148e829ef14f\": container with ID starting with 3b45ac9313f6c8989d8d447815663647fab10ea03425e48688ff148e829ef14f not found: ID does not exist" containerID="3b45ac9313f6c8989d8d447815663647fab10ea03425e48688ff148e829ef14f"
Apr 23 16:56:53.918644 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.918607 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b45ac9313f6c8989d8d447815663647fab10ea03425e48688ff148e829ef14f"} err="failed to get container status \"3b45ac9313f6c8989d8d447815663647fab10ea03425e48688ff148e829ef14f\": rpc error: code = NotFound desc = could not find container \"3b45ac9313f6c8989d8d447815663647fab10ea03425e48688ff148e829ef14f\": container with ID starting with 3b45ac9313f6c8989d8d447815663647fab10ea03425e48688ff148e829ef14f not found: ID does not exist"
Apr 23 16:56:53.921089 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:53.921065 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5c9576c899-6cntj"]
Apr 23 16:56:54.741417 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:56:54.741386 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62381469-ebaf-455d-9c10-1c121a0682aa" path="/var/lib/kubelet/pods/62381469-ebaf-455d-9c10-1c121a0682aa/volumes"
Apr 23 16:57:01.860538 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:01.860491 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" podUID="6d29c8d4-57cc-47e1-9981-e00dced1c3ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 23 16:57:11.860045 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:11.860003 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" podUID="6d29c8d4-57cc-47e1-9981-e00dced1c3ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 23 16:57:21.860449 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:21.860403 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k"
podUID="6d29c8d4-57cc-47e1-9981-e00dced1c3ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused" Apr 23 16:57:31.861568 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:31.861529 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" Apr 23 16:57:34.475892 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.475855 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k"] Apr 23 16:57:34.476311 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.476106 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" podUID="6d29c8d4-57cc-47e1-9981-e00dced1c3ee" containerName="kserve-container" containerID="cri-o://a85c98602cb34746f17dd0122d16ad9c117d42081929b65f36eaae309f0b79b9" gracePeriod=30 Apr 23 16:57:34.554772 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.554722 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l"] Apr 23 16:57:34.555081 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.555068 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62381469-ebaf-455d-9c10-1c121a0682aa" containerName="kserve-container" Apr 23 16:57:34.555124 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.555083 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="62381469-ebaf-455d-9c10-1c121a0682aa" containerName="kserve-container" Apr 23 16:57:34.555124 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.555091 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62381469-ebaf-455d-9c10-1c121a0682aa" containerName="kserve-agent" Apr 23 16:57:34.555124 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.555097 2562 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="62381469-ebaf-455d-9c10-1c121a0682aa" containerName="kserve-agent" Apr 23 16:57:34.555124 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.555114 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="62381469-ebaf-455d-9c10-1c121a0682aa" containerName="storage-initializer" Apr 23 16:57:34.555124 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.555119 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="62381469-ebaf-455d-9c10-1c121a0682aa" containerName="storage-initializer" Apr 23 16:57:34.555275 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.555163 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="62381469-ebaf-455d-9c10-1c121a0682aa" containerName="kserve-agent" Apr 23 16:57:34.555275 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.555175 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="62381469-ebaf-455d-9c10-1c121a0682aa" containerName="kserve-container" Apr 23 16:57:34.559297 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.559280 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" Apr 23 16:57:34.571141 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.571115 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l"] Apr 23 16:57:34.600912 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.600884 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5d8ddf4-0d63-4ecd-a373-4235173b9e85-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-rjh9l\" (UID: \"c5d8ddf4-0d63-4ecd-a373-4235173b9e85\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" Apr 23 16:57:34.701580 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.701541 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5d8ddf4-0d63-4ecd-a373-4235173b9e85-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-rjh9l\" (UID: \"c5d8ddf4-0d63-4ecd-a373-4235173b9e85\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" Apr 23 16:57:34.701965 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.701947 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5d8ddf4-0d63-4ecd-a373-4235173b9e85-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-rjh9l\" (UID: \"c5d8ddf4-0d63-4ecd-a373-4235173b9e85\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" Apr 23 16:57:34.869005 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.868928 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" Apr 23 16:57:34.988350 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:34.988325 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l"] Apr 23 16:57:34.990235 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:57:34.990206 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5d8ddf4_0d63_4ecd_a373_4235173b9e85.slice/crio-61f16176739108b34702bb36dcb742bf2a1f1588b003ed986b4ada514dc8fa3d WatchSource:0}: Error finding container 61f16176739108b34702bb36dcb742bf2a1f1588b003ed986b4ada514dc8fa3d: Status 404 returned error can't find the container with id 61f16176739108b34702bb36dcb742bf2a1f1588b003ed986b4ada514dc8fa3d Apr 23 16:57:35.018550 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:35.018523 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" event={"ID":"c5d8ddf4-0d63-4ecd-a373-4235173b9e85","Type":"ContainerStarted","Data":"61f16176739108b34702bb36dcb742bf2a1f1588b003ed986b4ada514dc8fa3d"} Apr 23 16:57:36.022811 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:36.022775 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" event={"ID":"c5d8ddf4-0d63-4ecd-a373-4235173b9e85","Type":"ContainerStarted","Data":"cde3a75e998cb27d3ba90a9379d85ee618c80e073f04a48ad262f2f303b5f20e"} Apr 23 16:57:37.319086 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:37.319064 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" Apr 23 16:57:37.423256 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:37.423171 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d29c8d4-57cc-47e1-9981-e00dced1c3ee-kserve-provision-location\") pod \"6d29c8d4-57cc-47e1-9981-e00dced1c3ee\" (UID: \"6d29c8d4-57cc-47e1-9981-e00dced1c3ee\") " Apr 23 16:57:37.432846 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:37.432817 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d29c8d4-57cc-47e1-9981-e00dced1c3ee-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6d29c8d4-57cc-47e1-9981-e00dced1c3ee" (UID: "6d29c8d4-57cc-47e1-9981-e00dced1c3ee"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:57:37.523692 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:37.523657 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d29c8d4-57cc-47e1-9981-e00dced1c3ee-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:57:38.029476 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:38.029442 2562 generic.go:358] "Generic (PLEG): container finished" podID="6d29c8d4-57cc-47e1-9981-e00dced1c3ee" containerID="a85c98602cb34746f17dd0122d16ad9c117d42081929b65f36eaae309f0b79b9" exitCode=0 Apr 23 16:57:38.029476 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:38.029479 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" event={"ID":"6d29c8d4-57cc-47e1-9981-e00dced1c3ee","Type":"ContainerDied","Data":"a85c98602cb34746f17dd0122d16ad9c117d42081929b65f36eaae309f0b79b9"} Apr 23 16:57:38.029709 ip-10-0-135-57 kubenswrapper[2562]: I0423 
16:57:38.029510 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" event={"ID":"6d29c8d4-57cc-47e1-9981-e00dced1c3ee","Type":"ContainerDied","Data":"a2c45c9188acc015d82452188adbdbb4ff2bb877b2a8f5fa8023102ba091dd73"} Apr 23 16:57:38.029709 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:38.029527 2562 scope.go:117] "RemoveContainer" containerID="a85c98602cb34746f17dd0122d16ad9c117d42081929b65f36eaae309f0b79b9" Apr 23 16:57:38.029709 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:38.029530 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k" Apr 23 16:57:38.039395 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:38.039373 2562 scope.go:117] "RemoveContainer" containerID="b34b6d45a8877489f04ad7ceb1010eb438ec0a6f7811a5ba2cdfc7b9911306ce" Apr 23 16:57:38.047259 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:38.047236 2562 scope.go:117] "RemoveContainer" containerID="a85c98602cb34746f17dd0122d16ad9c117d42081929b65f36eaae309f0b79b9" Apr 23 16:57:38.047521 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:57:38.047496 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a85c98602cb34746f17dd0122d16ad9c117d42081929b65f36eaae309f0b79b9\": container with ID starting with a85c98602cb34746f17dd0122d16ad9c117d42081929b65f36eaae309f0b79b9 not found: ID does not exist" containerID="a85c98602cb34746f17dd0122d16ad9c117d42081929b65f36eaae309f0b79b9" Apr 23 16:57:38.047569 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:38.047534 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a85c98602cb34746f17dd0122d16ad9c117d42081929b65f36eaae309f0b79b9"} err="failed to get container status \"a85c98602cb34746f17dd0122d16ad9c117d42081929b65f36eaae309f0b79b9\": rpc error: code = NotFound desc = could not find container 
\"a85c98602cb34746f17dd0122d16ad9c117d42081929b65f36eaae309f0b79b9\": container with ID starting with a85c98602cb34746f17dd0122d16ad9c117d42081929b65f36eaae309f0b79b9 not found: ID does not exist" Apr 23 16:57:38.047569 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:38.047559 2562 scope.go:117] "RemoveContainer" containerID="b34b6d45a8877489f04ad7ceb1010eb438ec0a6f7811a5ba2cdfc7b9911306ce" Apr 23 16:57:38.047852 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:57:38.047832 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b34b6d45a8877489f04ad7ceb1010eb438ec0a6f7811a5ba2cdfc7b9911306ce\": container with ID starting with b34b6d45a8877489f04ad7ceb1010eb438ec0a6f7811a5ba2cdfc7b9911306ce not found: ID does not exist" containerID="b34b6d45a8877489f04ad7ceb1010eb438ec0a6f7811a5ba2cdfc7b9911306ce" Apr 23 16:57:38.047904 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:38.047858 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b34b6d45a8877489f04ad7ceb1010eb438ec0a6f7811a5ba2cdfc7b9911306ce"} err="failed to get container status \"b34b6d45a8877489f04ad7ceb1010eb438ec0a6f7811a5ba2cdfc7b9911306ce\": rpc error: code = NotFound desc = could not find container \"b34b6d45a8877489f04ad7ceb1010eb438ec0a6f7811a5ba2cdfc7b9911306ce\": container with ID starting with b34b6d45a8877489f04ad7ceb1010eb438ec0a6f7811a5ba2cdfc7b9911306ce not found: ID does not exist" Apr 23 16:57:38.052769 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:38.052725 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k"] Apr 23 16:57:38.056445 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:38.056420 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-2cs4k"] Apr 23 16:57:38.744001 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:38.743954 2562 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="6d29c8d4-57cc-47e1-9981-e00dced1c3ee" path="/var/lib/kubelet/pods/6d29c8d4-57cc-47e1-9981-e00dced1c3ee/volumes" Apr 23 16:57:40.037629 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:40.037592 2562 generic.go:358] "Generic (PLEG): container finished" podID="c5d8ddf4-0d63-4ecd-a373-4235173b9e85" containerID="cde3a75e998cb27d3ba90a9379d85ee618c80e073f04a48ad262f2f303b5f20e" exitCode=0 Apr 23 16:57:40.038058 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:40.037675 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" event={"ID":"c5d8ddf4-0d63-4ecd-a373-4235173b9e85","Type":"ContainerDied","Data":"cde3a75e998cb27d3ba90a9379d85ee618c80e073f04a48ad262f2f303b5f20e"} Apr 23 16:57:41.042509 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:41.042472 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" event={"ID":"c5d8ddf4-0d63-4ecd-a373-4235173b9e85","Type":"ContainerStarted","Data":"ce4a179d677d3c94aaa37cb37a2786ed6bcb42f21b92a098753303519f9ff8be"} Apr 23 16:57:41.043011 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:41.042848 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" Apr 23 16:57:41.044250 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:41.044226 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" podUID="c5d8ddf4-0d63-4ecd-a373-4235173b9e85" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 16:57:41.061064 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:41.061019 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" 
podStartSLOduration=7.061006531 podStartE2EDuration="7.061006531s" podCreationTimestamp="2026-04-23 16:57:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:57:41.060030373 +0000 UTC m=+1340.912845451" watchObservedRunningTime="2026-04-23 16:57:41.061006531 +0000 UTC m=+1340.913821593" Apr 23 16:57:42.045799 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:42.045763 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" podUID="c5d8ddf4-0d63-4ecd-a373-4235173b9e85" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 16:57:52.046289 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:57:52.046233 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" podUID="c5d8ddf4-0d63-4ecd-a373-4235173b9e85" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 16:58:02.046562 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:02.046511 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" podUID="c5d8ddf4-0d63-4ecd-a373-4235173b9e85" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 16:58:12.046603 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:12.046550 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" podUID="c5d8ddf4-0d63-4ecd-a373-4235173b9e85" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 16:58:22.046717 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:22.046659 2562 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" podUID="c5d8ddf4-0d63-4ecd-a373-4235173b9e85" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 23 16:58:32.046929 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:32.046899 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" Apr 23 16:58:36.076434 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:36.076401 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l"] Apr 23 16:58:36.076850 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:36.076656 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" podUID="c5d8ddf4-0d63-4ecd-a373-4235173b9e85" containerName="kserve-container" containerID="cri-o://ce4a179d677d3c94aaa37cb37a2786ed6bcb42f21b92a098753303519f9ff8be" gracePeriod=30 Apr 23 16:58:36.190483 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:36.190443 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m"] Apr 23 16:58:36.190870 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:36.190853 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d29c8d4-57cc-47e1-9981-e00dced1c3ee" containerName="kserve-container" Apr 23 16:58:36.190961 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:36.190873 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d29c8d4-57cc-47e1-9981-e00dced1c3ee" containerName="kserve-container" Apr 23 16:58:36.190961 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:36.190905 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d29c8d4-57cc-47e1-9981-e00dced1c3ee" 
containerName="storage-initializer" Apr 23 16:58:36.190961 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:36.190914 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d29c8d4-57cc-47e1-9981-e00dced1c3ee" containerName="storage-initializer" Apr 23 16:58:36.191111 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:36.190998 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d29c8d4-57cc-47e1-9981-e00dced1c3ee" containerName="kserve-container" Apr 23 16:58:36.195282 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:36.195261 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" Apr 23 16:58:36.200724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:36.200702 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m"] Apr 23 16:58:36.285974 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:36.285936 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e4fad5d-c473-4c5f-9761-683213d3f383-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m\" (UID: \"7e4fad5d-c473-4c5f-9761-683213d3f383\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" Apr 23 16:58:36.386641 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:36.386562 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e4fad5d-c473-4c5f-9761-683213d3f383-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m\" (UID: \"7e4fad5d-c473-4c5f-9761-683213d3f383\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" Apr 23 16:58:36.386950 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:36.386932 2562 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e4fad5d-c473-4c5f-9761-683213d3f383-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m\" (UID: \"7e4fad5d-c473-4c5f-9761-683213d3f383\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" Apr 23 16:58:36.506793 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:36.506761 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" Apr 23 16:58:36.626412 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:36.626370 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m"] Apr 23 16:58:36.628607 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:58:36.628569 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e4fad5d_c473_4c5f_9761_683213d3f383.slice/crio-e5617da0816dbcb9df0dfc65f2e5b77bdb916a85661b940683043cd71c983fbf WatchSource:0}: Error finding container e5617da0816dbcb9df0dfc65f2e5b77bdb916a85661b940683043cd71c983fbf: Status 404 returned error can't find the container with id e5617da0816dbcb9df0dfc65f2e5b77bdb916a85661b940683043cd71c983fbf Apr 23 16:58:37.210441 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:37.210406 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" event={"ID":"7e4fad5d-c473-4c5f-9761-683213d3f383","Type":"ContainerStarted","Data":"0d47c9a2433ab02959bfa4870f189da61a03897c6d9c9742bfd7da640ed2cda2"} Apr 23 16:58:37.210441 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:37.210440 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" 
event={"ID":"7e4fad5d-c473-4c5f-9761-683213d3f383","Type":"ContainerStarted","Data":"e5617da0816dbcb9df0dfc65f2e5b77bdb916a85661b940683043cd71c983fbf"} Apr 23 16:58:38.913538 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:38.913514 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" Apr 23 16:58:39.008394 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:39.008308 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5d8ddf4-0d63-4ecd-a373-4235173b9e85-kserve-provision-location\") pod \"c5d8ddf4-0d63-4ecd-a373-4235173b9e85\" (UID: \"c5d8ddf4-0d63-4ecd-a373-4235173b9e85\") " Apr 23 16:58:39.017610 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:39.017582 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5d8ddf4-0d63-4ecd-a373-4235173b9e85-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c5d8ddf4-0d63-4ecd-a373-4235173b9e85" (UID: "c5d8ddf4-0d63-4ecd-a373-4235173b9e85"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:58:39.109065 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:39.109030 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5d8ddf4-0d63-4ecd-a373-4235173b9e85-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 16:58:39.217310 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:39.217275 2562 generic.go:358] "Generic (PLEG): container finished" podID="c5d8ddf4-0d63-4ecd-a373-4235173b9e85" containerID="ce4a179d677d3c94aaa37cb37a2786ed6bcb42f21b92a098753303519f9ff8be" exitCode=0 Apr 23 16:58:39.217474 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:39.217337 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" event={"ID":"c5d8ddf4-0d63-4ecd-a373-4235173b9e85","Type":"ContainerDied","Data":"ce4a179d677d3c94aaa37cb37a2786ed6bcb42f21b92a098753303519f9ff8be"} Apr 23 16:58:39.217474 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:39.217350 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l"
Apr 23 16:58:39.217474 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:39.217363 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l" event={"ID":"c5d8ddf4-0d63-4ecd-a373-4235173b9e85","Type":"ContainerDied","Data":"61f16176739108b34702bb36dcb742bf2a1f1588b003ed986b4ada514dc8fa3d"}
Apr 23 16:58:39.217474 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:39.217378 2562 scope.go:117] "RemoveContainer" containerID="ce4a179d677d3c94aaa37cb37a2786ed6bcb42f21b92a098753303519f9ff8be"
Apr 23 16:58:39.225381 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:39.225366 2562 scope.go:117] "RemoveContainer" containerID="cde3a75e998cb27d3ba90a9379d85ee618c80e073f04a48ad262f2f303b5f20e"
Apr 23 16:58:39.234489 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:39.234463 2562 scope.go:117] "RemoveContainer" containerID="ce4a179d677d3c94aaa37cb37a2786ed6bcb42f21b92a098753303519f9ff8be"
Apr 23 16:58:39.234958 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:58:39.234933 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce4a179d677d3c94aaa37cb37a2786ed6bcb42f21b92a098753303519f9ff8be\": container with ID starting with ce4a179d677d3c94aaa37cb37a2786ed6bcb42f21b92a098753303519f9ff8be not found: ID does not exist" containerID="ce4a179d677d3c94aaa37cb37a2786ed6bcb42f21b92a098753303519f9ff8be"
Apr 23 16:58:39.235051 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:39.234970 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce4a179d677d3c94aaa37cb37a2786ed6bcb42f21b92a098753303519f9ff8be"} err="failed to get container status \"ce4a179d677d3c94aaa37cb37a2786ed6bcb42f21b92a098753303519f9ff8be\": rpc error: code = NotFound desc = could not find container \"ce4a179d677d3c94aaa37cb37a2786ed6bcb42f21b92a098753303519f9ff8be\": container with ID starting with ce4a179d677d3c94aaa37cb37a2786ed6bcb42f21b92a098753303519f9ff8be not found: ID does not exist"
Apr 23 16:58:39.235051 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:39.234995 2562 scope.go:117] "RemoveContainer" containerID="cde3a75e998cb27d3ba90a9379d85ee618c80e073f04a48ad262f2f303b5f20e"
Apr 23 16:58:39.235290 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:58:39.235272 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cde3a75e998cb27d3ba90a9379d85ee618c80e073f04a48ad262f2f303b5f20e\": container with ID starting with cde3a75e998cb27d3ba90a9379d85ee618c80e073f04a48ad262f2f303b5f20e not found: ID does not exist" containerID="cde3a75e998cb27d3ba90a9379d85ee618c80e073f04a48ad262f2f303b5f20e"
Apr 23 16:58:39.235332 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:39.235296 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cde3a75e998cb27d3ba90a9379d85ee618c80e073f04a48ad262f2f303b5f20e"} err="failed to get container status \"cde3a75e998cb27d3ba90a9379d85ee618c80e073f04a48ad262f2f303b5f20e\": rpc error: code = NotFound desc = could not find container \"cde3a75e998cb27d3ba90a9379d85ee618c80e073f04a48ad262f2f303b5f20e\": container with ID starting with cde3a75e998cb27d3ba90a9379d85ee618c80e073f04a48ad262f2f303b5f20e not found: ID does not exist"
Apr 23 16:58:39.238074 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:39.238050 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l"]
Apr 23 16:58:39.243501 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:39.243473 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-rjh9l"]
Apr 23 16:58:40.743118 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:40.743075 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5d8ddf4-0d63-4ecd-a373-4235173b9e85" path="/var/lib/kubelet/pods/c5d8ddf4-0d63-4ecd-a373-4235173b9e85/volumes"
Apr 23 16:58:41.225489 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:41.225455 2562 generic.go:358] "Generic (PLEG): container finished" podID="7e4fad5d-c473-4c5f-9761-683213d3f383" containerID="0d47c9a2433ab02959bfa4870f189da61a03897c6d9c9742bfd7da640ed2cda2" exitCode=0
Apr 23 16:58:41.225665 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:41.225545 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" event={"ID":"7e4fad5d-c473-4c5f-9761-683213d3f383","Type":"ContainerDied","Data":"0d47c9a2433ab02959bfa4870f189da61a03897c6d9c9742bfd7da640ed2cda2"}
Apr 23 16:58:42.229556 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:42.229524 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" event={"ID":"7e4fad5d-c473-4c5f-9761-683213d3f383","Type":"ContainerStarted","Data":"a89524fa5a79259e694be47cd88d8322722a4c92a23cdab65138519afb6bab9c"}
Apr 23 16:58:42.229959 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:42.229834 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m"
Apr 23 16:58:42.231261 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:42.231233 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" podUID="7e4fad5d-c473-4c5f-9761-683213d3f383" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 23 16:58:42.245181 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:42.245134 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" podStartSLOduration=6.245117025 podStartE2EDuration="6.245117025s" podCreationTimestamp="2026-04-23 16:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:58:42.244859258 +0000 UTC m=+1402.097674325" watchObservedRunningTime="2026-04-23 16:58:42.245117025 +0000 UTC m=+1402.097932095"
Apr 23 16:58:43.232998 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:43.232962 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" podUID="7e4fad5d-c473-4c5f-9761-683213d3f383" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 23 16:58:53.233602 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:58:53.233557 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" podUID="7e4fad5d-c473-4c5f-9761-683213d3f383" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 23 16:59:03.233083 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:03.233039 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" podUID="7e4fad5d-c473-4c5f-9761-683213d3f383" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 23 16:59:13.233264 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:13.233219 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" podUID="7e4fad5d-c473-4c5f-9761-683213d3f383" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 23 16:59:23.233103 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:23.233055 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" podUID="7e4fad5d-c473-4c5f-9761-683213d3f383" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused"
Apr 23 16:59:33.234457 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:33.234422 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m"
Apr 23 16:59:37.986241 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:37.986209 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m"]
Apr 23 16:59:37.986659 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:37.986467 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" podUID="7e4fad5d-c473-4c5f-9761-683213d3f383" containerName="kserve-container" containerID="cri-o://a89524fa5a79259e694be47cd88d8322722a4c92a23cdab65138519afb6bab9c" gracePeriod=30
Apr 23 16:59:38.084120 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:38.084090 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf"]
Apr 23 16:59:38.084543 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:38.084524 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5d8ddf4-0d63-4ecd-a373-4235173b9e85" containerName="storage-initializer"
Apr 23 16:59:38.084642 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:38.084545 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d8ddf4-0d63-4ecd-a373-4235173b9e85" containerName="storage-initializer"
Apr 23 16:59:38.084642 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:38.084556 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5d8ddf4-0d63-4ecd-a373-4235173b9e85" containerName="kserve-container"
Apr 23 16:59:38.084642 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:38.084564 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5d8ddf4-0d63-4ecd-a373-4235173b9e85" containerName="kserve-container"
Apr 23 16:59:38.084642 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:38.084631 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5d8ddf4-0d63-4ecd-a373-4235173b9e85" containerName="kserve-container"
Apr 23 16:59:38.088815 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:38.088792 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf"
Apr 23 16:59:38.096022 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:38.095986 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-ggjzf\" (UID: \"49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf"
Apr 23 16:59:38.096510 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:38.096131 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf"]
Apr 23 16:59:38.196565 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:38.196525 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-ggjzf\" (UID: \"49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf"
Apr 23 16:59:38.197018 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:38.197000 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-ggjzf\" (UID: \"49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf"
Apr 23 16:59:38.400591 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:38.400561 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf"
Apr 23 16:59:38.525452 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:38.525426 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf"]
Apr 23 16:59:38.528229 ip-10-0-135-57 kubenswrapper[2562]: W0423 16:59:38.528196 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49f9eef5_75c8_413e_b4d1_f0a3e0ac8ad0.slice/crio-a494d12564b66d058ede4914c828fa1523fc30125f64ffe80396844c827636b7 WatchSource:0}: Error finding container a494d12564b66d058ede4914c828fa1523fc30125f64ffe80396844c827636b7: Status 404 returned error can't find the container with id a494d12564b66d058ede4914c828fa1523fc30125f64ffe80396844c827636b7
Apr 23 16:59:38.529981 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:38.529963 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 16:59:39.400235 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:39.400196 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" event={"ID":"49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0","Type":"ContainerStarted","Data":"e3b20f2988055491a9686782b25594069772cf7c61ddd7a7d49552f92be944f7"}
Apr 23 16:59:39.400592 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:39.400241 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" event={"ID":"49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0","Type":"ContainerStarted","Data":"a494d12564b66d058ede4914c828fa1523fc30125f64ffe80396844c827636b7"}
Apr 23 16:59:40.820337 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:40.820314 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m"
Apr 23 16:59:40.915256 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:40.915226 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e4fad5d-c473-4c5f-9761-683213d3f383-kserve-provision-location\") pod \"7e4fad5d-c473-4c5f-9761-683213d3f383\" (UID: \"7e4fad5d-c473-4c5f-9761-683213d3f383\") "
Apr 23 16:59:40.924393 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:40.924365 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e4fad5d-c473-4c5f-9761-683213d3f383-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7e4fad5d-c473-4c5f-9761-683213d3f383" (UID: "7e4fad5d-c473-4c5f-9761-683213d3f383"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:59:41.016005 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:41.015973 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7e4fad5d-c473-4c5f-9761-683213d3f383-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 16:59:41.408313 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:41.408282 2562 generic.go:358] "Generic (PLEG): container finished" podID="7e4fad5d-c473-4c5f-9761-683213d3f383" containerID="a89524fa5a79259e694be47cd88d8322722a4c92a23cdab65138519afb6bab9c" exitCode=0
Apr 23 16:59:41.408499 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:41.408325 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" event={"ID":"7e4fad5d-c473-4c5f-9761-683213d3f383","Type":"ContainerDied","Data":"a89524fa5a79259e694be47cd88d8322722a4c92a23cdab65138519afb6bab9c"}
Apr 23 16:59:41.408499 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:41.408343 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m"
Apr 23 16:59:41.408499 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:41.408359 2562 scope.go:117] "RemoveContainer" containerID="a89524fa5a79259e694be47cd88d8322722a4c92a23cdab65138519afb6bab9c"
Apr 23 16:59:41.408499 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:41.408347 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m" event={"ID":"7e4fad5d-c473-4c5f-9761-683213d3f383","Type":"ContainerDied","Data":"e5617da0816dbcb9df0dfc65f2e5b77bdb916a85661b940683043cd71c983fbf"}
Apr 23 16:59:41.416636 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:41.416614 2562 scope.go:117] "RemoveContainer" containerID="0d47c9a2433ab02959bfa4870f189da61a03897c6d9c9742bfd7da640ed2cda2"
Apr 23 16:59:41.423393 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:41.423374 2562 scope.go:117] "RemoveContainer" containerID="a89524fa5a79259e694be47cd88d8322722a4c92a23cdab65138519afb6bab9c"
Apr 23 16:59:41.423615 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:59:41.423597 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a89524fa5a79259e694be47cd88d8322722a4c92a23cdab65138519afb6bab9c\": container with ID starting with a89524fa5a79259e694be47cd88d8322722a4c92a23cdab65138519afb6bab9c not found: ID does not exist" containerID="a89524fa5a79259e694be47cd88d8322722a4c92a23cdab65138519afb6bab9c"
Apr 23 16:59:41.423669 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:41.423623 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a89524fa5a79259e694be47cd88d8322722a4c92a23cdab65138519afb6bab9c"} err="failed to get container status \"a89524fa5a79259e694be47cd88d8322722a4c92a23cdab65138519afb6bab9c\": rpc error: code = NotFound desc = could not find container \"a89524fa5a79259e694be47cd88d8322722a4c92a23cdab65138519afb6bab9c\": container with ID starting with a89524fa5a79259e694be47cd88d8322722a4c92a23cdab65138519afb6bab9c not found: ID does not exist"
Apr 23 16:59:41.423669 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:41.423640 2562 scope.go:117] "RemoveContainer" containerID="0d47c9a2433ab02959bfa4870f189da61a03897c6d9c9742bfd7da640ed2cda2"
Apr 23 16:59:41.423904 ip-10-0-135-57 kubenswrapper[2562]: E0423 16:59:41.423891 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d47c9a2433ab02959bfa4870f189da61a03897c6d9c9742bfd7da640ed2cda2\": container with ID starting with 0d47c9a2433ab02959bfa4870f189da61a03897c6d9c9742bfd7da640ed2cda2 not found: ID does not exist" containerID="0d47c9a2433ab02959bfa4870f189da61a03897c6d9c9742bfd7da640ed2cda2"
Apr 23 16:59:41.423956 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:41.423909 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d47c9a2433ab02959bfa4870f189da61a03897c6d9c9742bfd7da640ed2cda2"} err="failed to get container status \"0d47c9a2433ab02959bfa4870f189da61a03897c6d9c9742bfd7da640ed2cda2\": rpc error: code = NotFound desc = could not find container \"0d47c9a2433ab02959bfa4870f189da61a03897c6d9c9742bfd7da640ed2cda2\": container with ID starting with 0d47c9a2433ab02959bfa4870f189da61a03897c6d9c9742bfd7da640ed2cda2 not found: ID does not exist"
Apr 23 16:59:41.428785 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:41.428758 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m"]
Apr 23 16:59:41.432845 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:41.432826 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-6nf8m"]
Apr 23 16:59:42.412633 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:42.412600 2562 generic.go:358] "Generic (PLEG): container finished" podID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerID="e3b20f2988055491a9686782b25594069772cf7c61ddd7a7d49552f92be944f7" exitCode=0
Apr 23 16:59:42.413101 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:42.412675 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" event={"ID":"49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0","Type":"ContainerDied","Data":"e3b20f2988055491a9686782b25594069772cf7c61ddd7a7d49552f92be944f7"}
Apr 23 16:59:42.742112 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:42.742078 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e4fad5d-c473-4c5f-9761-683213d3f383" path="/var/lib/kubelet/pods/7e4fad5d-c473-4c5f-9761-683213d3f383/volumes"
Apr 23 16:59:49.440105 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:49.440069 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" event={"ID":"49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0","Type":"ContainerStarted","Data":"0fade8c0bdaa7b9653ee91f360e31799bfe74f49b314be6c523c4416cb512f16"}
Apr 23 16:59:49.440541 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:49.440377 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf"
Apr 23 16:59:49.441804 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:49.441780 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 23 16:59:49.458040 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:49.457960 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" podStartSLOduration=4.549764456 podStartE2EDuration="11.457941511s" podCreationTimestamp="2026-04-23 16:59:38 +0000 UTC" firstStartedPulling="2026-04-23 16:59:42.413955148 +0000 UTC m=+1462.266770194" lastFinishedPulling="2026-04-23 16:59:49.322132207 +0000 UTC m=+1469.174947249" observedRunningTime="2026-04-23 16:59:49.457376874 +0000 UTC m=+1469.310191959" watchObservedRunningTime="2026-04-23 16:59:49.457941511 +0000 UTC m=+1469.310756576"
Apr 23 16:59:50.443724 ip-10-0-135-57 kubenswrapper[2562]: I0423 16:59:50.443681 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 23 17:00:00.444268 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:00:00.444217 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 23 17:00:10.444418 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:00:10.444375 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 23 17:00:20.443863 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:00:20.443816 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 23 17:00:30.444694 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:00:30.444649 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 23 17:00:40.443926 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:00:40.443832 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 23 17:00:50.443716 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:00:50.443671 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 23 17:00:56.738861 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:00:56.738814 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 23 17:01:06.738905 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:06.738860 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused"
Apr 23 17:01:16.741560 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:16.741530 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf"
Apr 23 17:01:19.186737 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:19.186707 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf"]
Apr 23 17:01:19.187111 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:19.186955 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="kserve-container" containerID="cri-o://0fade8c0bdaa7b9653ee91f360e31799bfe74f49b314be6c523c4416cb512f16" gracePeriod=30
Apr 23 17:01:19.284710 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:19.284676 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j"]
Apr 23 17:01:19.285035 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:19.285021 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e4fad5d-c473-4c5f-9761-683213d3f383" containerName="kserve-container"
Apr 23 17:01:19.285082 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:19.285037 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e4fad5d-c473-4c5f-9761-683213d3f383" containerName="kserve-container"
Apr 23 17:01:19.285082 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:19.285069 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7e4fad5d-c473-4c5f-9761-683213d3f383" containerName="storage-initializer"
Apr 23 17:01:19.285082 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:19.285074 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e4fad5d-c473-4c5f-9761-683213d3f383" containerName="storage-initializer"
Apr 23 17:01:19.285174 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:19.285125 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="7e4fad5d-c473-4c5f-9761-683213d3f383" containerName="kserve-container"
Apr 23 17:01:19.288069 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:19.288053 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j"
Apr 23 17:01:19.298048 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:19.298026 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j"]
Apr 23 17:01:19.364847 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:19.364807 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/adece6cd-0836-4c60-9aab-4681b944afb3-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-k8m6j\" (UID: \"adece6cd-0836-4c60-9aab-4681b944afb3\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j"
Apr 23 17:01:19.465537 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:19.465448 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/adece6cd-0836-4c60-9aab-4681b944afb3-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-k8m6j\" (UID: \"adece6cd-0836-4c60-9aab-4681b944afb3\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j"
Apr 23 17:01:19.465863 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:19.465842 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/adece6cd-0836-4c60-9aab-4681b944afb3-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-k8m6j\" (UID: \"adece6cd-0836-4c60-9aab-4681b944afb3\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j"
Apr 23 17:01:19.598279 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:19.598246 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j"
Apr 23 17:01:19.715282 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:19.715257 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j"]
Apr 23 17:01:19.717583 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:01:19.717516 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadece6cd_0836_4c60_9aab_4681b944afb3.slice/crio-c8282c096e1c2a86c35c7e12f73516e4b21e83558122c8c1f8a6c1a2ffde6097 WatchSource:0}: Error finding container c8282c096e1c2a86c35c7e12f73516e4b21e83558122c8c1f8a6c1a2ffde6097: Status 404 returned error can't find the container with id c8282c096e1c2a86c35c7e12f73516e4b21e83558122c8c1f8a6c1a2ffde6097
Apr 23 17:01:20.710018 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:20.709986 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" event={"ID":"adece6cd-0836-4c60-9aab-4681b944afb3","Type":"ContainerStarted","Data":"7c6390aae5d26999290b3899177bc276bc84c539058a7d0259ad290a4c22f8c7"}
Apr 23 17:01:20.710415 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:20.710026 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" event={"ID":"adece6cd-0836-4c60-9aab-4681b944afb3","Type":"ContainerStarted","Data":"c8282c096e1c2a86c35c7e12f73516e4b21e83558122c8c1f8a6c1a2ffde6097"}
Apr 23 17:01:23.019173 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.019150 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf"
Apr 23 17:01:23.095307 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.095229 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0-kserve-provision-location\") pod \"49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0\" (UID: \"49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0\") "
Apr 23 17:01:23.095572 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.095553 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" (UID: "49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:01:23.196268 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.196227 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 17:01:23.719451 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.719415 2562 generic.go:358] "Generic (PLEG): container finished" podID="adece6cd-0836-4c60-9aab-4681b944afb3" containerID="7c6390aae5d26999290b3899177bc276bc84c539058a7d0259ad290a4c22f8c7" exitCode=0
Apr 23 17:01:23.719629 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.719493 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" event={"ID":"adece6cd-0836-4c60-9aab-4681b944afb3","Type":"ContainerDied","Data":"7c6390aae5d26999290b3899177bc276bc84c539058a7d0259ad290a4c22f8c7"}
Apr 23 17:01:23.720835 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.720802 2562 generic.go:358] "Generic (PLEG): container finished" podID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerID="0fade8c0bdaa7b9653ee91f360e31799bfe74f49b314be6c523c4416cb512f16" exitCode=0
Apr 23 17:01:23.720923 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.720853 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" event={"ID":"49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0","Type":"ContainerDied","Data":"0fade8c0bdaa7b9653ee91f360e31799bfe74f49b314be6c523c4416cb512f16"}
Apr 23 17:01:23.720923 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.720881 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf" event={"ID":"49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0","Type":"ContainerDied","Data":"a494d12564b66d058ede4914c828fa1523fc30125f64ffe80396844c827636b7"}
Apr 23 17:01:23.720923 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.720888 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf"
Apr 23 17:01:23.720923 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.720899 2562 scope.go:117] "RemoveContainer" containerID="0fade8c0bdaa7b9653ee91f360e31799bfe74f49b314be6c523c4416cb512f16"
Apr 23 17:01:23.729001 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.728915 2562 scope.go:117] "RemoveContainer" containerID="e3b20f2988055491a9686782b25594069772cf7c61ddd7a7d49552f92be944f7"
Apr 23 17:01:23.736382 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.736361 2562 scope.go:117] "RemoveContainer" containerID="0fade8c0bdaa7b9653ee91f360e31799bfe74f49b314be6c523c4416cb512f16"
Apr 23 17:01:23.736786 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:01:23.736765 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fade8c0bdaa7b9653ee91f360e31799bfe74f49b314be6c523c4416cb512f16\": container with ID starting with 0fade8c0bdaa7b9653ee91f360e31799bfe74f49b314be6c523c4416cb512f16 not found: ID does not exist" containerID="0fade8c0bdaa7b9653ee91f360e31799bfe74f49b314be6c523c4416cb512f16"
Apr 23 17:01:23.736860 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.736794 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fade8c0bdaa7b9653ee91f360e31799bfe74f49b314be6c523c4416cb512f16"} err="failed to get container status \"0fade8c0bdaa7b9653ee91f360e31799bfe74f49b314be6c523c4416cb512f16\": rpc error: code = NotFound desc = could not find container \"0fade8c0bdaa7b9653ee91f360e31799bfe74f49b314be6c523c4416cb512f16\": container with ID starting with 0fade8c0bdaa7b9653ee91f360e31799bfe74f49b314be6c523c4416cb512f16 not found: ID does not exist"
Apr 23 17:01:23.736860 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.736813 2562 scope.go:117] "RemoveContainer" containerID="e3b20f2988055491a9686782b25594069772cf7c61ddd7a7d49552f92be944f7"
Apr 23 17:01:23.737078 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:01:23.737050 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3b20f2988055491a9686782b25594069772cf7c61ddd7a7d49552f92be944f7\": container with ID starting with e3b20f2988055491a9686782b25594069772cf7c61ddd7a7d49552f92be944f7 not found: ID does not exist" containerID="e3b20f2988055491a9686782b25594069772cf7c61ddd7a7d49552f92be944f7"
Apr 23 17:01:23.737140 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.737081 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3b20f2988055491a9686782b25594069772cf7c61ddd7a7d49552f92be944f7"} err="failed to get container status \"e3b20f2988055491a9686782b25594069772cf7c61ddd7a7d49552f92be944f7\": rpc error: code = NotFound desc = could not find container \"e3b20f2988055491a9686782b25594069772cf7c61ddd7a7d49552f92be944f7\": container with ID starting with e3b20f2988055491a9686782b25594069772cf7c61ddd7a7d49552f92be944f7 not found: ID does not exist"
Apr 23 17:01:23.750442 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.750412 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf"]
Apr 23 17:01:23.751665 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:23.751644 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-ggjzf"]
Apr 23 17:01:24.726505 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:24.726467 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" event={"ID":"adece6cd-0836-4c60-9aab-4681b944afb3","Type":"ContainerStarted","Data":"133f7fba1fcb124fb16da916e7971c4e8c67cfb8e742409260a8e9ef185fedcd"}
Apr 23 17:01:24.726932 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:24.726775 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j"
Apr 23 17:01:24.728176 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:24.728143 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 23 17:01:24.741369 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:24.741343 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" path="/var/lib/kubelet/pods/49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0/volumes"
Apr 23 17:01:24.743726 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:24.743680 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" podStartSLOduration=5.743663762 podStartE2EDuration="5.743663762s" podCreationTimestamp="2026-04-23 17:01:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:01:24.743094282 +0000 UTC m=+1564.595909349" watchObservedRunningTime="2026-04-23 17:01:24.743663762 +0000 UTC m=+1564.596478828"
Apr 23 17:01:25.729720 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:25.729685 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused"
Apr 23 17:01:35.730120 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:35.730072 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" containerName="kserve-container"
probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 17:01:45.729727 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:45.729684 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 17:01:55.730562 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:01:55.730512 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 17:02:05.730523 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:05.730480 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 17:02:15.729630 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:15.729584 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 17:02:25.730517 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:25.730470 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 17:02:35.730557 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:02:35.730507 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 23 17:02:45.731313 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:45.731283 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" Apr 23 17:02:50.393122 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.393085 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j"] Apr 23 17:02:50.393647 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.393366 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" containerName="kserve-container" containerID="cri-o://133f7fba1fcb124fb16da916e7971c4e8c67cfb8e742409260a8e9ef185fedcd" gracePeriod=30 Apr 23 17:02:50.475601 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.475567 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj"] Apr 23 17:02:50.475895 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.475882 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="storage-initializer" Apr 23 17:02:50.475945 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.475896 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="storage-initializer" Apr 23 17:02:50.475945 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.475917 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="kserve-container" Apr 23 17:02:50.475945 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.475922 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="kserve-container" Apr 23 17:02:50.476041 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.475973 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="49f9eef5-75c8-413e-b4d1-f0a3e0ac8ad0" containerName="kserve-container" Apr 23 17:02:50.478686 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.478671 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" Apr 23 17:02:50.486440 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.486416 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj"] Apr 23 17:02:50.644101 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.644013 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c2bec93-bd43-4867-8740-247e44f4db47-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj\" (UID: \"8c2bec93-bd43-4867-8740-247e44f4db47\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" Apr 23 17:02:50.745504 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.745469 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c2bec93-bd43-4867-8740-247e44f4db47-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj\" (UID: \"8c2bec93-bd43-4867-8740-247e44f4db47\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" Apr 23 17:02:50.745902 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.745883 
2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c2bec93-bd43-4867-8740-247e44f4db47-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj\" (UID: \"8c2bec93-bd43-4867-8740-247e44f4db47\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" Apr 23 17:02:50.789555 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.789532 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" Apr 23 17:02:50.910295 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.910273 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj"] Apr 23 17:02:50.912274 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:02:50.912249 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c2bec93_bd43_4867_8740_247e44f4db47.slice/crio-894d094b8cd54986c92b7343840cc1ad71dbd5e17b6d9826581b1b7e2032741c WatchSource:0}: Error finding container 894d094b8cd54986c92b7343840cc1ad71dbd5e17b6d9826581b1b7e2032741c: Status 404 returned error can't find the container with id 894d094b8cd54986c92b7343840cc1ad71dbd5e17b6d9826581b1b7e2032741c Apr 23 17:02:50.982957 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.982928 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" event={"ID":"8c2bec93-bd43-4867-8740-247e44f4db47","Type":"ContainerStarted","Data":"ad68d2e55d5be35e26fe26fa04d81dbf07b9bde846cda165de0661547a694728"} Apr 23 17:02:50.983121 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:50.982962 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" 
event={"ID":"8c2bec93-bd43-4867-8740-247e44f4db47","Type":"ContainerStarted","Data":"894d094b8cd54986c92b7343840cc1ad71dbd5e17b6d9826581b1b7e2032741c"} Apr 23 17:02:54.132542 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:54.132510 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" Apr 23 17:02:54.276379 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:54.276283 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/adece6cd-0836-4c60-9aab-4681b944afb3-kserve-provision-location\") pod \"adece6cd-0836-4c60-9aab-4681b944afb3\" (UID: \"adece6cd-0836-4c60-9aab-4681b944afb3\") " Apr 23 17:02:54.276642 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:54.276615 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adece6cd-0836-4c60-9aab-4681b944afb3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "adece6cd-0836-4c60-9aab-4681b944afb3" (UID: "adece6cd-0836-4c60-9aab-4681b944afb3"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:02:54.377720 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:54.377683 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/adece6cd-0836-4c60-9aab-4681b944afb3-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:02:54.995806 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:54.995772 2562 generic.go:358] "Generic (PLEG): container finished" podID="adece6cd-0836-4c60-9aab-4681b944afb3" containerID="133f7fba1fcb124fb16da916e7971c4e8c67cfb8e742409260a8e9ef185fedcd" exitCode=0 Apr 23 17:02:54.995984 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:54.995844 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" event={"ID":"adece6cd-0836-4c60-9aab-4681b944afb3","Type":"ContainerDied","Data":"133f7fba1fcb124fb16da916e7971c4e8c67cfb8e742409260a8e9ef185fedcd"} Apr 23 17:02:54.995984 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:54.995892 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" event={"ID":"adece6cd-0836-4c60-9aab-4681b944afb3","Type":"ContainerDied","Data":"c8282c096e1c2a86c35c7e12f73516e4b21e83558122c8c1f8a6c1a2ffde6097"} Apr 23 17:02:54.995984 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:54.995902 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j" Apr 23 17:02:54.995984 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:54.995912 2562 scope.go:117] "RemoveContainer" containerID="133f7fba1fcb124fb16da916e7971c4e8c67cfb8e742409260a8e9ef185fedcd" Apr 23 17:02:54.997161 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:54.997138 2562 generic.go:358] "Generic (PLEG): container finished" podID="8c2bec93-bd43-4867-8740-247e44f4db47" containerID="ad68d2e55d5be35e26fe26fa04d81dbf07b9bde846cda165de0661547a694728" exitCode=0 Apr 23 17:02:54.997288 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:54.997210 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" event={"ID":"8c2bec93-bd43-4867-8740-247e44f4db47","Type":"ContainerDied","Data":"ad68d2e55d5be35e26fe26fa04d81dbf07b9bde846cda165de0661547a694728"} Apr 23 17:02:55.003785 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:55.003764 2562 scope.go:117] "RemoveContainer" containerID="7c6390aae5d26999290b3899177bc276bc84c539058a7d0259ad290a4c22f8c7" Apr 23 17:02:55.010851 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:55.010831 2562 scope.go:117] "RemoveContainer" containerID="133f7fba1fcb124fb16da916e7971c4e8c67cfb8e742409260a8e9ef185fedcd" Apr 23 17:02:55.011101 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:02:55.011077 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"133f7fba1fcb124fb16da916e7971c4e8c67cfb8e742409260a8e9ef185fedcd\": container with ID starting with 133f7fba1fcb124fb16da916e7971c4e8c67cfb8e742409260a8e9ef185fedcd not found: ID does not exist" containerID="133f7fba1fcb124fb16da916e7971c4e8c67cfb8e742409260a8e9ef185fedcd" Apr 23 17:02:55.011195 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:55.011111 2562 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"133f7fba1fcb124fb16da916e7971c4e8c67cfb8e742409260a8e9ef185fedcd"} err="failed to get container status \"133f7fba1fcb124fb16da916e7971c4e8c67cfb8e742409260a8e9ef185fedcd\": rpc error: code = NotFound desc = could not find container \"133f7fba1fcb124fb16da916e7971c4e8c67cfb8e742409260a8e9ef185fedcd\": container with ID starting with 133f7fba1fcb124fb16da916e7971c4e8c67cfb8e742409260a8e9ef185fedcd not found: ID does not exist" Apr 23 17:02:55.011195 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:55.011135 2562 scope.go:117] "RemoveContainer" containerID="7c6390aae5d26999290b3899177bc276bc84c539058a7d0259ad290a4c22f8c7" Apr 23 17:02:55.011397 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:02:55.011380 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c6390aae5d26999290b3899177bc276bc84c539058a7d0259ad290a4c22f8c7\": container with ID starting with 7c6390aae5d26999290b3899177bc276bc84c539058a7d0259ad290a4c22f8c7 not found: ID does not exist" containerID="7c6390aae5d26999290b3899177bc276bc84c539058a7d0259ad290a4c22f8c7" Apr 23 17:02:55.011456 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:55.011405 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c6390aae5d26999290b3899177bc276bc84c539058a7d0259ad290a4c22f8c7"} err="failed to get container status \"7c6390aae5d26999290b3899177bc276bc84c539058a7d0259ad290a4c22f8c7\": rpc error: code = NotFound desc = could not find container \"7c6390aae5d26999290b3899177bc276bc84c539058a7d0259ad290a4c22f8c7\": container with ID starting with 7c6390aae5d26999290b3899177bc276bc84c539058a7d0259ad290a4c22f8c7 not found: ID does not exist" Apr 23 17:02:55.028113 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:55.028092 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j"] Apr 23 17:02:55.030531 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:02:55.030511 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-k8m6j"] Apr 23 17:02:56.002889 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:56.002853 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" event={"ID":"8c2bec93-bd43-4867-8740-247e44f4db47","Type":"ContainerStarted","Data":"e4d77aa65c31f19ffb908f53c82e78c48f1f51ee791c2807783de26223e3dd5d"} Apr 23 17:02:56.003351 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:56.003154 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" Apr 23 17:02:56.004534 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:56.004501 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 17:02:56.020346 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:56.020262 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" podStartSLOduration=6.020249701 podStartE2EDuration="6.020249701s" podCreationTimestamp="2026-04-23 17:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:02:56.018875555 +0000 UTC m=+1655.871690631" watchObservedRunningTime="2026-04-23 17:02:56.020249701 +0000 UTC m=+1655.873064764" Apr 23 17:02:56.743109 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:56.743074 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" 
path="/var/lib/kubelet/pods/adece6cd-0836-4c60-9aab-4681b944afb3/volumes" Apr 23 17:02:57.006174 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:02:57.006086 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 17:03:07.006762 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:03:07.006691 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 17:03:17.006379 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:03:17.006337 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 17:03:27.006577 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:03:27.006532 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 17:03:37.006891 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:03:37.006840 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 17:03:47.006691 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:03:47.006605 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 17:03:57.006877 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:03:57.006823 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 17:04:07.006415 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:07.006358 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 23 17:04:17.006900 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:17.006873 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" Apr 23 17:04:21.588961 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:21.588924 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj"] Apr 23 17:04:21.589420 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:21.589274 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" containerName="kserve-container" containerID="cri-o://e4d77aa65c31f19ffb908f53c82e78c48f1f51ee791c2807783de26223e3dd5d" gracePeriod=30 Apr 23 17:04:21.683466 ip-10-0-135-57 kubenswrapper[2562]: I0423 
17:04:21.683429 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k"] Apr 23 17:04:21.683770 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:21.683756 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" containerName="storage-initializer" Apr 23 17:04:21.683770 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:21.683771 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" containerName="storage-initializer" Apr 23 17:04:21.683866 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:21.683792 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" containerName="kserve-container" Apr 23 17:04:21.683866 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:21.683798 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" containerName="kserve-container" Apr 23 17:04:21.683866 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:21.683843 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="adece6cd-0836-4c60-9aab-4681b944afb3" containerName="kserve-container" Apr 23 17:04:21.686690 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:21.686673 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" Apr 23 17:04:21.697501 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:21.697476 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k"] Apr 23 17:04:21.824216 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:21.824183 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/22efa691-c303-4485-b21a-8bec25bfe284-kserve-provision-location\") pod \"isvc-primary-75ab52-predictor-686df6f4f7-f785k\" (UID: \"22efa691-c303-4485-b21a-8bec25bfe284\") " pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" Apr 23 17:04:21.925432 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:21.925392 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/22efa691-c303-4485-b21a-8bec25bfe284-kserve-provision-location\") pod \"isvc-primary-75ab52-predictor-686df6f4f7-f785k\" (UID: \"22efa691-c303-4485-b21a-8bec25bfe284\") " pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" Apr 23 17:04:21.925795 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:21.925776 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/22efa691-c303-4485-b21a-8bec25bfe284-kserve-provision-location\") pod \"isvc-primary-75ab52-predictor-686df6f4f7-f785k\" (UID: \"22efa691-c303-4485-b21a-8bec25bfe284\") " pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" Apr 23 17:04:21.996535 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:21.996504 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" Apr 23 17:04:22.121206 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:22.121183 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k"] Apr 23 17:04:22.123398 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:04:22.123371 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22efa691_c303_4485_b21a_8bec25bfe284.slice/crio-251ae82d02049724310e9c5f4a9e19e5db44e0b0d2c3c5ddda1070b6f704890d WatchSource:0}: Error finding container 251ae82d02049724310e9c5f4a9e19e5db44e0b0d2c3c5ddda1070b6f704890d: Status 404 returned error can't find the container with id 251ae82d02049724310e9c5f4a9e19e5db44e0b0d2c3c5ddda1070b6f704890d Apr 23 17:04:22.272816 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:22.272774 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" event={"ID":"22efa691-c303-4485-b21a-8bec25bfe284","Type":"ContainerStarted","Data":"d68e5e193b9e26aa0b74e80226e7f99178ec1da63216dfd7a6e0d6e64ac7472e"} Apr 23 17:04:22.272816 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:22.272816 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" event={"ID":"22efa691-c303-4485-b21a-8bec25bfe284","Type":"ContainerStarted","Data":"251ae82d02049724310e9c5f4a9e19e5db44e0b0d2c3c5ddda1070b6f704890d"} Apr 23 17:04:25.283178 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:25.283146 2562 generic.go:358] "Generic (PLEG): container finished" podID="8c2bec93-bd43-4867-8740-247e44f4db47" containerID="e4d77aa65c31f19ffb908f53c82e78c48f1f51ee791c2807783de26223e3dd5d" exitCode=0 Apr 23 17:04:25.283623 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:25.283198 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" event={"ID":"8c2bec93-bd43-4867-8740-247e44f4db47","Type":"ContainerDied","Data":"e4d77aa65c31f19ffb908f53c82e78c48f1f51ee791c2807783de26223e3dd5d"}
Apr 23 17:04:25.326601 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:25.326575 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj"
Apr 23 17:04:25.455932 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:25.455901 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c2bec93-bd43-4867-8740-247e44f4db47-kserve-provision-location\") pod \"8c2bec93-bd43-4867-8740-247e44f4db47\" (UID: \"8c2bec93-bd43-4867-8740-247e44f4db47\") "
Apr 23 17:04:25.456307 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:25.456275 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c2bec93-bd43-4867-8740-247e44f4db47-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8c2bec93-bd43-4867-8740-247e44f4db47" (UID: "8c2bec93-bd43-4867-8740-247e44f4db47"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:04:25.557062 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:25.557022 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8c2bec93-bd43-4867-8740-247e44f4db47-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 17:04:26.287338 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:26.287300 2562 generic.go:358] "Generic (PLEG): container finished" podID="22efa691-c303-4485-b21a-8bec25bfe284" containerID="d68e5e193b9e26aa0b74e80226e7f99178ec1da63216dfd7a6e0d6e64ac7472e" exitCode=0
Apr 23 17:04:26.287766 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:26.287373 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" event={"ID":"22efa691-c303-4485-b21a-8bec25bfe284","Type":"ContainerDied","Data":"d68e5e193b9e26aa0b74e80226e7f99178ec1da63216dfd7a6e0d6e64ac7472e"}
Apr 23 17:04:26.288889 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:26.288867 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj" event={"ID":"8c2bec93-bd43-4867-8740-247e44f4db47","Type":"ContainerDied","Data":"894d094b8cd54986c92b7343840cc1ad71dbd5e17b6d9826581b1b7e2032741c"}
Apr 23 17:04:26.288964 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:26.288900 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj"
Apr 23 17:04:26.289007 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:26.288907 2562 scope.go:117] "RemoveContainer" containerID="e4d77aa65c31f19ffb908f53c82e78c48f1f51ee791c2807783de26223e3dd5d"
Apr 23 17:04:26.297369 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:26.297342 2562 scope.go:117] "RemoveContainer" containerID="ad68d2e55d5be35e26fe26fa04d81dbf07b9bde846cda165de0661547a694728"
Apr 23 17:04:26.316551 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:26.316528 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj"]
Apr 23 17:04:26.320137 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:26.320115 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-27hcj"]
Apr 23 17:04:26.743078 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:26.743041 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" path="/var/lib/kubelet/pods/8c2bec93-bd43-4867-8740-247e44f4db47/volumes"
Apr 23 17:04:27.292827 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:27.292794 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" event={"ID":"22efa691-c303-4485-b21a-8bec25bfe284","Type":"ContainerStarted","Data":"5fbf160c71979e1002b0d3d29540e285f4e94779e9fd07b158eb0036aa02f83b"}
Apr 23 17:04:27.293242 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:27.293111 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k"
Apr 23 17:04:27.294485 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:27.294455 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" podUID="22efa691-c303-4485-b21a-8bec25bfe284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 17:04:27.311190 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:27.311093 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" podStartSLOduration=6.311079735 podStartE2EDuration="6.311079735s" podCreationTimestamp="2026-04-23 17:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:04:27.31008443 +0000 UTC m=+1747.162899485" watchObservedRunningTime="2026-04-23 17:04:27.311079735 +0000 UTC m=+1747.163894836"
Apr 23 17:04:28.296944 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:28.296903 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" podUID="22efa691-c303-4485-b21a-8bec25bfe284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 17:04:38.297380 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:38.297312 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" podUID="22efa691-c303-4485-b21a-8bec25bfe284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 17:04:48.297235 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:48.297186 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" podUID="22efa691-c303-4485-b21a-8bec25bfe284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 17:04:58.297293 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:04:58.297248 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" podUID="22efa691-c303-4485-b21a-8bec25bfe284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 17:05:08.297154 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:08.297109 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" podUID="22efa691-c303-4485-b21a-8bec25bfe284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 17:05:18.297461 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:18.297417 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" podUID="22efa691-c303-4485-b21a-8bec25bfe284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 17:05:28.297088 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:28.297040 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" podUID="22efa691-c303-4485-b21a-8bec25bfe284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 17:05:37.738971 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:37.738934 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k"
Apr 23 17:05:41.886584 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:41.886553 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"]
Apr 23 17:05:41.887049 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:41.886894 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" containerName="storage-initializer"
Apr 23 17:05:41.887049 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:41.886906 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" containerName="storage-initializer"
Apr 23 17:05:41.887049 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:41.886914 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" containerName="kserve-container"
Apr 23 17:05:41.887049 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:41.886920 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" containerName="kserve-container"
Apr 23 17:05:41.887049 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:41.886971 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8c2bec93-bd43-4867-8740-247e44f4db47" containerName="kserve-container"
Apr 23 17:05:41.891163 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:41.891147 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"
Apr 23 17:05:41.894226 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:41.894202 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 23 17:05:41.894355 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:41.894238 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-75ab52-dockercfg-rvx95\""
Apr 23 17:05:41.895297 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:41.895281 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-75ab52\""
Apr 23 17:05:41.899692 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:41.899671 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"]
Apr 23 17:05:41.985982 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:41.985943 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c20d0a4-b64b-4814-add9-1bee2e8fd056-kserve-provision-location\") pod \"isvc-secondary-75ab52-predictor-747fcf8956-dcqfw\" (UID: \"7c20d0a4-b64b-4814-add9-1bee2e8fd056\") " pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"
Apr 23 17:05:41.985982 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:41.985986 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7c20d0a4-b64b-4814-add9-1bee2e8fd056-cabundle-cert\") pod \"isvc-secondary-75ab52-predictor-747fcf8956-dcqfw\" (UID: \"7c20d0a4-b64b-4814-add9-1bee2e8fd056\") " pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"
Apr 23 17:05:42.086960 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:42.086926 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c20d0a4-b64b-4814-add9-1bee2e8fd056-kserve-provision-location\") pod \"isvc-secondary-75ab52-predictor-747fcf8956-dcqfw\" (UID: \"7c20d0a4-b64b-4814-add9-1bee2e8fd056\") " pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"
Apr 23 17:05:42.086960 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:42.086965 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7c20d0a4-b64b-4814-add9-1bee2e8fd056-cabundle-cert\") pod \"isvc-secondary-75ab52-predictor-747fcf8956-dcqfw\" (UID: \"7c20d0a4-b64b-4814-add9-1bee2e8fd056\") " pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"
Apr 23 17:05:42.087311 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:42.087291 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c20d0a4-b64b-4814-add9-1bee2e8fd056-kserve-provision-location\") pod \"isvc-secondary-75ab52-predictor-747fcf8956-dcqfw\" (UID: \"7c20d0a4-b64b-4814-add9-1bee2e8fd056\") " pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"
Apr 23 17:05:42.087550 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:42.087533 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7c20d0a4-b64b-4814-add9-1bee2e8fd056-cabundle-cert\") pod \"isvc-secondary-75ab52-predictor-747fcf8956-dcqfw\" (UID: \"7c20d0a4-b64b-4814-add9-1bee2e8fd056\") " pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"
Apr 23 17:05:42.201883 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:42.201847 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"
Apr 23 17:05:42.324655 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:42.324624 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"]
Apr 23 17:05:42.327044 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:05:42.327019 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c20d0a4_b64b_4814_add9_1bee2e8fd056.slice/crio-ec9af64b396e1c3cef26ae9148e4ce8a32c55b9b07f3573af0c0564ef6d9351a WatchSource:0}: Error finding container ec9af64b396e1c3cef26ae9148e4ce8a32c55b9b07f3573af0c0564ef6d9351a: Status 404 returned error can't find the container with id ec9af64b396e1c3cef26ae9148e4ce8a32c55b9b07f3573af0c0564ef6d9351a
Apr 23 17:05:42.328888 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:42.328871 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:05:42.516378 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:42.516299 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw" event={"ID":"7c20d0a4-b64b-4814-add9-1bee2e8fd056","Type":"ContainerStarted","Data":"cb5e34ba013df8edd312b31a7ecfd829d3c2cf9fb888f7508325685c62ae157c"}
Apr 23 17:05:42.516378 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:42.516336 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw" event={"ID":"7c20d0a4-b64b-4814-add9-1bee2e8fd056","Type":"ContainerStarted","Data":"ec9af64b396e1c3cef26ae9148e4ce8a32c55b9b07f3573af0c0564ef6d9351a"}
Apr 23 17:05:45.526039 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:45.526011 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-75ab52-predictor-747fcf8956-dcqfw_7c20d0a4-b64b-4814-add9-1bee2e8fd056/storage-initializer/0.log"
Apr 23 17:05:45.526431 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:45.526049 2562 generic.go:358] "Generic (PLEG): container finished" podID="7c20d0a4-b64b-4814-add9-1bee2e8fd056" containerID="cb5e34ba013df8edd312b31a7ecfd829d3c2cf9fb888f7508325685c62ae157c" exitCode=1
Apr 23 17:05:45.526431 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:45.526130 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw" event={"ID":"7c20d0a4-b64b-4814-add9-1bee2e8fd056","Type":"ContainerDied","Data":"cb5e34ba013df8edd312b31a7ecfd829d3c2cf9fb888f7508325685c62ae157c"}
Apr 23 17:05:46.531062 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:46.531035 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-75ab52-predictor-747fcf8956-dcqfw_7c20d0a4-b64b-4814-add9-1bee2e8fd056/storage-initializer/0.log"
Apr 23 17:05:46.531446 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:46.531116 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw" event={"ID":"7c20d0a4-b64b-4814-add9-1bee2e8fd056","Type":"ContainerStarted","Data":"3410d3b5ab0e25e2526d11a3a80ab831fcde5575a6704e8c1f2c5e4a30abd902"}
Apr 23 17:05:50.544139 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:50.544112 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-75ab52-predictor-747fcf8956-dcqfw_7c20d0a4-b64b-4814-add9-1bee2e8fd056/storage-initializer/1.log"
Apr 23 17:05:50.544557 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:50.544440 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-75ab52-predictor-747fcf8956-dcqfw_7c20d0a4-b64b-4814-add9-1bee2e8fd056/storage-initializer/0.log"
Apr 23 17:05:50.544557 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:50.544473 2562 generic.go:358] "Generic (PLEG): container finished" podID="7c20d0a4-b64b-4814-add9-1bee2e8fd056" containerID="3410d3b5ab0e25e2526d11a3a80ab831fcde5575a6704e8c1f2c5e4a30abd902" exitCode=1
Apr 23 17:05:50.544680 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:50.544555 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw" event={"ID":"7c20d0a4-b64b-4814-add9-1bee2e8fd056","Type":"ContainerDied","Data":"3410d3b5ab0e25e2526d11a3a80ab831fcde5575a6704e8c1f2c5e4a30abd902"}
Apr 23 17:05:50.544680 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:50.544609 2562 scope.go:117] "RemoveContainer" containerID="cb5e34ba013df8edd312b31a7ecfd829d3c2cf9fb888f7508325685c62ae157c"
Apr 23 17:05:50.545038 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:50.545018 2562 scope.go:117] "RemoveContainer" containerID="cb5e34ba013df8edd312b31a7ecfd829d3c2cf9fb888f7508325685c62ae157c"
Apr 23 17:05:50.554863 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:05:50.554834 2562 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-75ab52-predictor-747fcf8956-dcqfw_kserve-ci-e2e-test_7c20d0a4-b64b-4814-add9-1bee2e8fd056_0 in pod sandbox ec9af64b396e1c3cef26ae9148e4ce8a32c55b9b07f3573af0c0564ef6d9351a from index: no such id: 'cb5e34ba013df8edd312b31a7ecfd829d3c2cf9fb888f7508325685c62ae157c'" containerID="cb5e34ba013df8edd312b31a7ecfd829d3c2cf9fb888f7508325685c62ae157c"
Apr 23 17:05:50.554946 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:05:50.554888 2562 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-75ab52-predictor-747fcf8956-dcqfw_kserve-ci-e2e-test_7c20d0a4-b64b-4814-add9-1bee2e8fd056_0 in pod sandbox ec9af64b396e1c3cef26ae9148e4ce8a32c55b9b07f3573af0c0564ef6d9351a from index: no such id: 'cb5e34ba013df8edd312b31a7ecfd829d3c2cf9fb888f7508325685c62ae157c'; Skipping pod \"isvc-secondary-75ab52-predictor-747fcf8956-dcqfw_kserve-ci-e2e-test(7c20d0a4-b64b-4814-add9-1bee2e8fd056)\"" logger="UnhandledError"
Apr 23 17:05:50.556235 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:05:50.556215 2562 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-75ab52-predictor-747fcf8956-dcqfw_kserve-ci-e2e-test(7c20d0a4-b64b-4814-add9-1bee2e8fd056)\"" pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw" podUID="7c20d0a4-b64b-4814-add9-1bee2e8fd056"
Apr 23 17:05:51.554468 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:51.554438 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-75ab52-predictor-747fcf8956-dcqfw_7c20d0a4-b64b-4814-add9-1bee2e8fd056/storage-initializer/1.log"
Apr 23 17:05:55.918812 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:55.918775 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"]
Apr 23 17:05:55.973534 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:55.973503 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k"]
Apr 23 17:05:55.973921 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:55.973891 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" podUID="22efa691-c303-4485-b21a-8bec25bfe284" containerName="kserve-container" containerID="cri-o://5fbf160c71979e1002b0d3d29540e285f4e94779e9fd07b158eb0036aa02f83b" gracePeriod=30
Apr 23 17:05:56.054759 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.054722 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-75ab52-predictor-747fcf8956-dcqfw_7c20d0a4-b64b-4814-add9-1bee2e8fd056/storage-initializer/1.log"
Apr 23 17:05:56.054887 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.054808 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"
Apr 23 17:05:56.076011 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.075978 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq"]
Apr 23 17:05:56.076324 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.076310 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c20d0a4-b64b-4814-add9-1bee2e8fd056" containerName="storage-initializer"
Apr 23 17:05:56.076370 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.076326 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c20d0a4-b64b-4814-add9-1bee2e8fd056" containerName="storage-initializer"
Apr 23 17:05:56.076370 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.076346 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c20d0a4-b64b-4814-add9-1bee2e8fd056" containerName="storage-initializer"
Apr 23 17:05:56.076370 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.076351 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c20d0a4-b64b-4814-add9-1bee2e8fd056" containerName="storage-initializer"
Apr 23 17:05:56.076467 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.076404 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c20d0a4-b64b-4814-add9-1bee2e8fd056" containerName="storage-initializer"
Apr 23 17:05:56.076467 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.076414 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c20d0a4-b64b-4814-add9-1bee2e8fd056" containerName="storage-initializer"
Apr 23 17:05:56.079474 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.079459 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq"
Apr 23 17:05:56.081977 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.081954 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-1e3fac\""
Apr 23 17:05:56.081977 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.081968 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-1e3fac-dockercfg-lc2t9\""
Apr 23 17:05:56.085927 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.085901 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq"]
Apr 23 17:05:56.106307 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.106281 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c20d0a4-b64b-4814-add9-1bee2e8fd056-kserve-provision-location\") pod \"7c20d0a4-b64b-4814-add9-1bee2e8fd056\" (UID: \"7c20d0a4-b64b-4814-add9-1bee2e8fd056\") "
Apr 23 17:05:56.106469 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.106341 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7c20d0a4-b64b-4814-add9-1bee2e8fd056-cabundle-cert\") pod \"7c20d0a4-b64b-4814-add9-1bee2e8fd056\" (UID: \"7c20d0a4-b64b-4814-add9-1bee2e8fd056\") "
Apr 23 17:05:56.106547 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.106496 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fee12420-77f1-4d1e-9c5c-918d905413cf-kserve-provision-location\") pod \"isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq\" (UID: \"fee12420-77f1-4d1e-9c5c-918d905413cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq"
Apr 23 17:05:56.106614 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.106555 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fee12420-77f1-4d1e-9c5c-918d905413cf-cabundle-cert\") pod \"isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq\" (UID: \"fee12420-77f1-4d1e-9c5c-918d905413cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq"
Apr 23 17:05:56.106614 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.106587 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c20d0a4-b64b-4814-add9-1bee2e8fd056-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7c20d0a4-b64b-4814-add9-1bee2e8fd056" (UID: "7c20d0a4-b64b-4814-add9-1bee2e8fd056"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:05:56.106686 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.106672 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c20d0a4-b64b-4814-add9-1bee2e8fd056-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "7c20d0a4-b64b-4814-add9-1bee2e8fd056" (UID: "7c20d0a4-b64b-4814-add9-1bee2e8fd056"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 17:05:56.207885 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.207801 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fee12420-77f1-4d1e-9c5c-918d905413cf-kserve-provision-location\") pod \"isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq\" (UID: \"fee12420-77f1-4d1e-9c5c-918d905413cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq"
Apr 23 17:05:56.208044 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.207897 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fee12420-77f1-4d1e-9c5c-918d905413cf-cabundle-cert\") pod \"isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq\" (UID: \"fee12420-77f1-4d1e-9c5c-918d905413cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq"
Apr 23 17:05:56.208044 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.207942 2562 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/7c20d0a4-b64b-4814-add9-1bee2e8fd056-cabundle-cert\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 17:05:56.208044 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.207958 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c20d0a4-b64b-4814-add9-1bee2e8fd056-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 17:05:56.208179 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.208159 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fee12420-77f1-4d1e-9c5c-918d905413cf-kserve-provision-location\") pod \"isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq\" (UID: \"fee12420-77f1-4d1e-9c5c-918d905413cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq"
Apr 23 17:05:56.208435 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.208419 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fee12420-77f1-4d1e-9c5c-918d905413cf-cabundle-cert\") pod \"isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq\" (UID: \"fee12420-77f1-4d1e-9c5c-918d905413cf\") " pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq"
Apr 23 17:05:56.389866 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.389835 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq"
Apr 23 17:05:56.511642 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.511606 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq"]
Apr 23 17:05:56.514466 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:05:56.514439 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfee12420_77f1_4d1e_9c5c_918d905413cf.slice/crio-33eca2846390becd19173a62841a5326dc132579e47a81344fa8c3346ee40f38 WatchSource:0}: Error finding container 33eca2846390becd19173a62841a5326dc132579e47a81344fa8c3346ee40f38: Status 404 returned error can't find the container with id 33eca2846390becd19173a62841a5326dc132579e47a81344fa8c3346ee40f38
Apr 23 17:05:56.569796 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.569775 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-75ab52-predictor-747fcf8956-dcqfw_7c20d0a4-b64b-4814-add9-1bee2e8fd056/storage-initializer/1.log"
Apr 23 17:05:56.569910 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.569865 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw" event={"ID":"7c20d0a4-b64b-4814-add9-1bee2e8fd056","Type":"ContainerDied","Data":"ec9af64b396e1c3cef26ae9148e4ce8a32c55b9b07f3573af0c0564ef6d9351a"}
Apr 23 17:05:56.569910 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.569894 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"
Apr 23 17:05:56.569910 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.569904 2562 scope.go:117] "RemoveContainer" containerID="3410d3b5ab0e25e2526d11a3a80ab831fcde5575a6704e8c1f2c5e4a30abd902"
Apr 23 17:05:56.571423 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.571399 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq" event={"ID":"fee12420-77f1-4d1e-9c5c-918d905413cf","Type":"ContainerStarted","Data":"33eca2846390becd19173a62841a5326dc132579e47a81344fa8c3346ee40f38"}
Apr 23 17:05:56.604050 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.604018 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"]
Apr 23 17:05:56.606420 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.606394 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-75ab52-predictor-747fcf8956-dcqfw"]
Apr 23 17:05:56.742837 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:56.742736 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c20d0a4-b64b-4814-add9-1bee2e8fd056" path="/var/lib/kubelet/pods/7c20d0a4-b64b-4814-add9-1bee2e8fd056/volumes"
Apr 23 17:05:57.576256 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:57.576225 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq" event={"ID":"fee12420-77f1-4d1e-9c5c-918d905413cf","Type":"ContainerStarted","Data":"676da95c30c549042f33d0f6c7b69010b11679a114b6bf1a07fff338862fcf93"}
Apr 23 17:05:57.738398 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:05:57.738348 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" podUID="22efa691-c303-4485-b21a-8bec25bfe284" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused"
Apr 23 17:06:00.408234 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.408211 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k"
Apr 23 17:06:00.443517 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.443490 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/22efa691-c303-4485-b21a-8bec25bfe284-kserve-provision-location\") pod \"22efa691-c303-4485-b21a-8bec25bfe284\" (UID: \"22efa691-c303-4485-b21a-8bec25bfe284\") "
Apr 23 17:06:00.443843 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.443817 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22efa691-c303-4485-b21a-8bec25bfe284-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "22efa691-c303-4485-b21a-8bec25bfe284" (UID: "22efa691-c303-4485-b21a-8bec25bfe284"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:06:00.544292 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.544202 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/22efa691-c303-4485-b21a-8bec25bfe284-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 17:06:00.587806 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.587777 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq_fee12420-77f1-4d1e-9c5c-918d905413cf/storage-initializer/0.log"
Apr 23 17:06:00.587983 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.587816 2562 generic.go:358] "Generic (PLEG): container finished" podID="fee12420-77f1-4d1e-9c5c-918d905413cf" containerID="676da95c30c549042f33d0f6c7b69010b11679a114b6bf1a07fff338862fcf93" exitCode=1
Apr 23 17:06:00.587983 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.587899 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq" event={"ID":"fee12420-77f1-4d1e-9c5c-918d905413cf","Type":"ContainerDied","Data":"676da95c30c549042f33d0f6c7b69010b11679a114b6bf1a07fff338862fcf93"}
Apr 23 17:06:00.589342 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.589315 2562 generic.go:358] "Generic (PLEG): container finished" podID="22efa691-c303-4485-b21a-8bec25bfe284" containerID="5fbf160c71979e1002b0d3d29540e285f4e94779e9fd07b158eb0036aa02f83b" exitCode=0
Apr 23 17:06:00.589436 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.589348 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" event={"ID":"22efa691-c303-4485-b21a-8bec25bfe284","Type":"ContainerDied","Data":"5fbf160c71979e1002b0d3d29540e285f4e94779e9fd07b158eb0036aa02f83b"}
Apr 23 17:06:00.589436 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.589379 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k" event={"ID":"22efa691-c303-4485-b21a-8bec25bfe284","Type":"ContainerDied","Data":"251ae82d02049724310e9c5f4a9e19e5db44e0b0d2c3c5ddda1070b6f704890d"}
Apr 23 17:06:00.589436 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.589395 2562 scope.go:117] "RemoveContainer" containerID="5fbf160c71979e1002b0d3d29540e285f4e94779e9fd07b158eb0036aa02f83b"
Apr 23 17:06:00.589559 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.589391 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k"
Apr 23 17:06:00.597197 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.597135 2562 scope.go:117] "RemoveContainer" containerID="d68e5e193b9e26aa0b74e80226e7f99178ec1da63216dfd7a6e0d6e64ac7472e"
Apr 23 17:06:00.604382 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.604360 2562 scope.go:117] "RemoveContainer" containerID="5fbf160c71979e1002b0d3d29540e285f4e94779e9fd07b158eb0036aa02f83b"
Apr 23 17:06:00.604712 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:06:00.604669 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fbf160c71979e1002b0d3d29540e285f4e94779e9fd07b158eb0036aa02f83b\": container with ID starting with 5fbf160c71979e1002b0d3d29540e285f4e94779e9fd07b158eb0036aa02f83b not found: ID does not exist" containerID="5fbf160c71979e1002b0d3d29540e285f4e94779e9fd07b158eb0036aa02f83b"
Apr 23 17:06:00.604712 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.604703 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fbf160c71979e1002b0d3d29540e285f4e94779e9fd07b158eb0036aa02f83b"} err="failed to get container status \"5fbf160c71979e1002b0d3d29540e285f4e94779e9fd07b158eb0036aa02f83b\": rpc error: code = NotFound desc = could not 
find container \"5fbf160c71979e1002b0d3d29540e285f4e94779e9fd07b158eb0036aa02f83b\": container with ID starting with 5fbf160c71979e1002b0d3d29540e285f4e94779e9fd07b158eb0036aa02f83b not found: ID does not exist" Apr 23 17:06:00.604967 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.604721 2562 scope.go:117] "RemoveContainer" containerID="d68e5e193b9e26aa0b74e80226e7f99178ec1da63216dfd7a6e0d6e64ac7472e" Apr 23 17:06:00.605054 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:06:00.605029 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d68e5e193b9e26aa0b74e80226e7f99178ec1da63216dfd7a6e0d6e64ac7472e\": container with ID starting with d68e5e193b9e26aa0b74e80226e7f99178ec1da63216dfd7a6e0d6e64ac7472e not found: ID does not exist" containerID="d68e5e193b9e26aa0b74e80226e7f99178ec1da63216dfd7a6e0d6e64ac7472e" Apr 23 17:06:00.605116 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.605057 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d68e5e193b9e26aa0b74e80226e7f99178ec1da63216dfd7a6e0d6e64ac7472e"} err="failed to get container status \"d68e5e193b9e26aa0b74e80226e7f99178ec1da63216dfd7a6e0d6e64ac7472e\": rpc error: code = NotFound desc = could not find container \"d68e5e193b9e26aa0b74e80226e7f99178ec1da63216dfd7a6e0d6e64ac7472e\": container with ID starting with d68e5e193b9e26aa0b74e80226e7f99178ec1da63216dfd7a6e0d6e64ac7472e not found: ID does not exist" Apr 23 17:06:00.619076 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.619037 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k"] Apr 23 17:06:00.621818 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:00.621794 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-75ab52-predictor-686df6f4f7-f785k"] Apr 23 17:06:00.742140 ip-10-0-135-57 kubenswrapper[2562]: I0423 
17:06:00.742107 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22efa691-c303-4485-b21a-8bec25bfe284" path="/var/lib/kubelet/pods/22efa691-c303-4485-b21a-8bec25bfe284/volumes" Apr 23 17:06:01.098152 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.098118 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq"] Apr 23 17:06:01.209771 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.209727 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5"] Apr 23 17:06:01.210064 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.210053 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22efa691-c303-4485-b21a-8bec25bfe284" containerName="kserve-container" Apr 23 17:06:01.210123 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.210066 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="22efa691-c303-4485-b21a-8bec25bfe284" containerName="kserve-container" Apr 23 17:06:01.210123 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.210081 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22efa691-c303-4485-b21a-8bec25bfe284" containerName="storage-initializer" Apr 23 17:06:01.210123 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.210086 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="22efa691-c303-4485-b21a-8bec25bfe284" containerName="storage-initializer" Apr 23 17:06:01.210224 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.210136 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="22efa691-c303-4485-b21a-8bec25bfe284" containerName="kserve-container" Apr 23 17:06:01.213411 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.213393 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" Apr 23 17:06:01.216297 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.216277 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rtrj5\"" Apr 23 17:06:01.223392 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.223369 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5"] Apr 23 17:06:01.249976 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.249936 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b357a94-22ab-4fe2-aa0e-e269237d3b64-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-drgh5\" (UID: \"5b357a94-22ab-4fe2-aa0e-e269237d3b64\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" Apr 23 17:06:01.350502 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.350417 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b357a94-22ab-4fe2-aa0e-e269237d3b64-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-drgh5\" (UID: \"5b357a94-22ab-4fe2-aa0e-e269237d3b64\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" Apr 23 17:06:01.350869 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.350846 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b357a94-22ab-4fe2-aa0e-e269237d3b64-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-drgh5\" (UID: \"5b357a94-22ab-4fe2-aa0e-e269237d3b64\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" Apr 23 
17:06:01.524769 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.524710 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" Apr 23 17:06:01.593856 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.593830 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq_fee12420-77f1-4d1e-9c5c-918d905413cf/storage-initializer/0.log" Apr 23 17:06:01.594136 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.594092 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq" podUID="fee12420-77f1-4d1e-9c5c-918d905413cf" containerName="storage-initializer" containerID="cri-o://730c55a481a6fbe633a242e7a2410a88318873942f35f0d6a1f438d8735795e5" gracePeriod=30 Apr 23 17:06:01.594381 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.594354 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq" event={"ID":"fee12420-77f1-4d1e-9c5c-918d905413cf","Type":"ContainerStarted","Data":"730c55a481a6fbe633a242e7a2410a88318873942f35f0d6a1f438d8735795e5"} Apr 23 17:06:01.648272 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:01.648085 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5"] Apr 23 17:06:01.650809 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:06:01.650778 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b357a94_22ab_4fe2_aa0e_e269237d3b64.slice/crio-c07510eb51fb2bfadfb049530a6ac79a811cccc254069743a4217e2d669e9723 WatchSource:0}: Error finding container c07510eb51fb2bfadfb049530a6ac79a811cccc254069743a4217e2d669e9723: Status 404 returned error can't find the container with id 
c07510eb51fb2bfadfb049530a6ac79a811cccc254069743a4217e2d669e9723 Apr 23 17:06:02.599707 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:02.599668 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" event={"ID":"5b357a94-22ab-4fe2-aa0e-e269237d3b64","Type":"ContainerStarted","Data":"430491bb5e9f0e1188448f3aa116cd388f3c2d0405660abbe12c947de004cf3c"} Apr 23 17:06:02.600124 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:02.599715 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" event={"ID":"5b357a94-22ab-4fe2-aa0e-e269237d3b64","Type":"ContainerStarted","Data":"c07510eb51fb2bfadfb049530a6ac79a811cccc254069743a4217e2d669e9723"} Apr 23 17:06:05.609851 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:05.609760 2562 generic.go:358] "Generic (PLEG): container finished" podID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" containerID="430491bb5e9f0e1188448f3aa116cd388f3c2d0405660abbe12c947de004cf3c" exitCode=0 Apr 23 17:06:05.609851 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:05.609833 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" event={"ID":"5b357a94-22ab-4fe2-aa0e-e269237d3b64","Type":"ContainerDied","Data":"430491bb5e9f0e1188448f3aa116cd388f3c2d0405660abbe12c947de004cf3c"} Apr 23 17:06:06.128380 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.128356 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq_fee12420-77f1-4d1e-9c5c-918d905413cf/storage-initializer/1.log" Apr 23 17:06:06.128773 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.128732 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq_fee12420-77f1-4d1e-9c5c-918d905413cf/storage-initializer/0.log" Apr 23 
17:06:06.128863 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.128826 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq" Apr 23 17:06:06.193375 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.193347 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fee12420-77f1-4d1e-9c5c-918d905413cf-cabundle-cert\") pod \"fee12420-77f1-4d1e-9c5c-918d905413cf\" (UID: \"fee12420-77f1-4d1e-9c5c-918d905413cf\") " Apr 23 17:06:06.193563 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.193400 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fee12420-77f1-4d1e-9c5c-918d905413cf-kserve-provision-location\") pod \"fee12420-77f1-4d1e-9c5c-918d905413cf\" (UID: \"fee12420-77f1-4d1e-9c5c-918d905413cf\") " Apr 23 17:06:06.193729 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.193705 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fee12420-77f1-4d1e-9c5c-918d905413cf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fee12420-77f1-4d1e-9c5c-918d905413cf" (UID: "fee12420-77f1-4d1e-9c5c-918d905413cf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:06:06.193729 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.193716 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fee12420-77f1-4d1e-9c5c-918d905413cf-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "fee12420-77f1-4d1e-9c5c-918d905413cf" (UID: "fee12420-77f1-4d1e-9c5c-918d905413cf"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:06:06.295032 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.294921 2562 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fee12420-77f1-4d1e-9c5c-918d905413cf-cabundle-cert\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:06:06.295032 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.294962 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fee12420-77f1-4d1e-9c5c-918d905413cf-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:06:06.614337 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.614257 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq_fee12420-77f1-4d1e-9c5c-918d905413cf/storage-initializer/1.log" Apr 23 17:06:06.614715 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.614615 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq_fee12420-77f1-4d1e-9c5c-918d905413cf/storage-initializer/0.log" Apr 23 17:06:06.614715 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.614645 2562 generic.go:358] "Generic (PLEG): container finished" podID="fee12420-77f1-4d1e-9c5c-918d905413cf" containerID="730c55a481a6fbe633a242e7a2410a88318873942f35f0d6a1f438d8735795e5" exitCode=1 Apr 23 17:06:06.614836 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.614718 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq" Apr 23 17:06:06.614836 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.614732 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq" event={"ID":"fee12420-77f1-4d1e-9c5c-918d905413cf","Type":"ContainerDied","Data":"730c55a481a6fbe633a242e7a2410a88318873942f35f0d6a1f438d8735795e5"} Apr 23 17:06:06.614836 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.614777 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq" event={"ID":"fee12420-77f1-4d1e-9c5c-918d905413cf","Type":"ContainerDied","Data":"33eca2846390becd19173a62841a5326dc132579e47a81344fa8c3346ee40f38"} Apr 23 17:06:06.614836 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.614793 2562 scope.go:117] "RemoveContainer" containerID="730c55a481a6fbe633a242e7a2410a88318873942f35f0d6a1f438d8735795e5" Apr 23 17:06:06.623121 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.623101 2562 scope.go:117] "RemoveContainer" containerID="676da95c30c549042f33d0f6c7b69010b11679a114b6bf1a07fff338862fcf93" Apr 23 17:06:06.630216 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.630197 2562 scope.go:117] "RemoveContainer" containerID="730c55a481a6fbe633a242e7a2410a88318873942f35f0d6a1f438d8735795e5" Apr 23 17:06:06.630475 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:06:06.630453 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"730c55a481a6fbe633a242e7a2410a88318873942f35f0d6a1f438d8735795e5\": container with ID starting with 730c55a481a6fbe633a242e7a2410a88318873942f35f0d6a1f438d8735795e5 not found: ID does not exist" containerID="730c55a481a6fbe633a242e7a2410a88318873942f35f0d6a1f438d8735795e5" Apr 23 17:06:06.630524 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.630483 2562 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"730c55a481a6fbe633a242e7a2410a88318873942f35f0d6a1f438d8735795e5"} err="failed to get container status \"730c55a481a6fbe633a242e7a2410a88318873942f35f0d6a1f438d8735795e5\": rpc error: code = NotFound desc = could not find container \"730c55a481a6fbe633a242e7a2410a88318873942f35f0d6a1f438d8735795e5\": container with ID starting with 730c55a481a6fbe633a242e7a2410a88318873942f35f0d6a1f438d8735795e5 not found: ID does not exist" Apr 23 17:06:06.630524 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.630501 2562 scope.go:117] "RemoveContainer" containerID="676da95c30c549042f33d0f6c7b69010b11679a114b6bf1a07fff338862fcf93" Apr 23 17:06:06.630761 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:06:06.630728 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"676da95c30c549042f33d0f6c7b69010b11679a114b6bf1a07fff338862fcf93\": container with ID starting with 676da95c30c549042f33d0f6c7b69010b11679a114b6bf1a07fff338862fcf93 not found: ID does not exist" containerID="676da95c30c549042f33d0f6c7b69010b11679a114b6bf1a07fff338862fcf93" Apr 23 17:06:06.630837 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.630769 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"676da95c30c549042f33d0f6c7b69010b11679a114b6bf1a07fff338862fcf93"} err="failed to get container status \"676da95c30c549042f33d0f6c7b69010b11679a114b6bf1a07fff338862fcf93\": rpc error: code = NotFound desc = could not find container \"676da95c30c549042f33d0f6c7b69010b11679a114b6bf1a07fff338862fcf93\": container with ID starting with 676da95c30c549042f33d0f6c7b69010b11679a114b6bf1a07fff338862fcf93 not found: ID does not exist" Apr 23 17:06:06.650914 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.650886 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq"] Apr 23 17:06:06.652249 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.652227 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-1e3fac-predictor-6c48c449dd-lqbkq"] Apr 23 17:06:06.743910 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:06.743875 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fee12420-77f1-4d1e-9c5c-918d905413cf" path="/var/lib/kubelet/pods/fee12420-77f1-4d1e-9c5c-918d905413cf/volumes" Apr 23 17:06:27.682383 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:27.682338 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" event={"ID":"5b357a94-22ab-4fe2-aa0e-e269237d3b64","Type":"ContainerStarted","Data":"04ac9357fd697bf267a5f13229dc40f8ab8aa5750c81c1ba4fdeda42c047cbcc"} Apr 23 17:06:27.682873 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:27.682629 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" Apr 23 17:06:27.683777 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:27.683726 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 17:06:27.699844 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:27.699798 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" podStartSLOduration=5.391657261 podStartE2EDuration="26.699785104s" podCreationTimestamp="2026-04-23 17:06:01 +0000 UTC" firstStartedPulling="2026-04-23 17:06:05.610948393 +0000 UTC m=+1845.463763434" lastFinishedPulling="2026-04-23 
17:06:26.919076221 +0000 UTC m=+1866.771891277" observedRunningTime="2026-04-23 17:06:27.697918206 +0000 UTC m=+1867.550733271" watchObservedRunningTime="2026-04-23 17:06:27.699785104 +0000 UTC m=+1867.552600167" Apr 23 17:06:28.685721 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:28.685683 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 17:06:38.686575 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:38.686523 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 17:06:48.686041 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:48.685997 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 17:06:58.686360 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:06:58.686310 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 17:07:08.686017 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:08.685970 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 17:07:18.685729 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:18.685686 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 17:07:28.686281 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:28.686204 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 17:07:38.686295 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:38.686247 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 23 17:07:48.686800 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:48.686768 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" Apr 23 17:07:51.422199 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.422162 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5"] Apr 23 17:07:51.422581 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.422446 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" 
containerName="kserve-container" containerID="cri-o://04ac9357fd697bf267a5f13229dc40f8ab8aa5750c81c1ba4fdeda42c047cbcc" gracePeriod=30 Apr 23 17:07:51.486339 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.486305 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb"] Apr 23 17:07:51.486666 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.486652 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fee12420-77f1-4d1e-9c5c-918d905413cf" containerName="storage-initializer" Apr 23 17:07:51.486711 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.486668 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee12420-77f1-4d1e-9c5c-918d905413cf" containerName="storage-initializer" Apr 23 17:07:51.486711 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.486684 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fee12420-77f1-4d1e-9c5c-918d905413cf" containerName="storage-initializer" Apr 23 17:07:51.486711 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.486690 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="fee12420-77f1-4d1e-9c5c-918d905413cf" containerName="storage-initializer" Apr 23 17:07:51.486851 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.486761 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="fee12420-77f1-4d1e-9c5c-918d905413cf" containerName="storage-initializer" Apr 23 17:07:51.486886 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.486879 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="fee12420-77f1-4d1e-9c5c-918d905413cf" containerName="storage-initializer" Apr 23 17:07:51.489964 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.489948 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" Apr 23 17:07:51.498137 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.498109 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb"] Apr 23 17:07:51.624918 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.624873 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e67e442c-1c6c-4df1-b237-5d1d04cfd134-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-79qbb\" (UID: \"e67e442c-1c6c-4df1-b237-5d1d04cfd134\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" Apr 23 17:07:51.726136 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.726035 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e67e442c-1c6c-4df1-b237-5d1d04cfd134-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-79qbb\" (UID: \"e67e442c-1c6c-4df1-b237-5d1d04cfd134\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" Apr 23 17:07:51.726439 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.726420 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e67e442c-1c6c-4df1-b237-5d1d04cfd134-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-79qbb\" (UID: \"e67e442c-1c6c-4df1-b237-5d1d04cfd134\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" Apr 23 17:07:51.800206 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.800167 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb"
Apr 23 17:07:51.920624 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:51.920553 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb"]
Apr 23 17:07:51.923050 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:07:51.923020 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode67e442c_1c6c_4df1_b237_5d1d04cfd134.slice/crio-df47605f661c53fd4f350140df9aa9aa0a809c727ac90fc540aecb692336d995 WatchSource:0}: Error finding container df47605f661c53fd4f350140df9aa9aa0a809c727ac90fc540aecb692336d995: Status 404 returned error can't find the container with id df47605f661c53fd4f350140df9aa9aa0a809c727ac90fc540aecb692336d995
Apr 23 17:07:52.928818 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:52.928782 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" event={"ID":"e67e442c-1c6c-4df1-b237-5d1d04cfd134","Type":"ContainerStarted","Data":"eeea2c961f36f8621384c9f6b364f00cf030fe7bf3438ff2375c14858673b229"}
Apr 23 17:07:52.928818 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:52.928820 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" event={"ID":"e67e442c-1c6c-4df1-b237-5d1d04cfd134","Type":"ContainerStarted","Data":"df47605f661c53fd4f350140df9aa9aa0a809c727ac90fc540aecb692336d995"}
Apr 23 17:07:55.943621 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:55.943589 2562 generic.go:358] "Generic (PLEG): container finished" podID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerID="eeea2c961f36f8621384c9f6b364f00cf030fe7bf3438ff2375c14858673b229" exitCode=0
Apr 23 17:07:55.944006 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:55.943669 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" event={"ID":"e67e442c-1c6c-4df1-b237-5d1d04cfd134","Type":"ContainerDied","Data":"eeea2c961f36f8621384c9f6b364f00cf030fe7bf3438ff2375c14858673b229"}
Apr 23 17:07:56.463198 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.463174 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5"
Apr 23 17:07:56.558557 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.558462 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b357a94-22ab-4fe2-aa0e-e269237d3b64-kserve-provision-location\") pod \"5b357a94-22ab-4fe2-aa0e-e269237d3b64\" (UID: \"5b357a94-22ab-4fe2-aa0e-e269237d3b64\") "
Apr 23 17:07:56.558803 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.558780 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b357a94-22ab-4fe2-aa0e-e269237d3b64-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5b357a94-22ab-4fe2-aa0e-e269237d3b64" (UID: "5b357a94-22ab-4fe2-aa0e-e269237d3b64"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:07:56.659259 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.659222 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b357a94-22ab-4fe2-aa0e-e269237d3b64-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 17:07:56.947927 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.947891 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" event={"ID":"e67e442c-1c6c-4df1-b237-5d1d04cfd134","Type":"ContainerStarted","Data":"5242b0cfc3675311791d1e9944a52bf025a8bf6d28de30bcf78282929feff905"}
Apr 23 17:07:56.948363 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.948195 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb"
Apr 23 17:07:56.949343 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.949319 2562 generic.go:358] "Generic (PLEG): container finished" podID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" containerID="04ac9357fd697bf267a5f13229dc40f8ab8aa5750c81c1ba4fdeda42c047cbcc" exitCode=0
Apr 23 17:07:56.949468 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.949395 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5"
Apr 23 17:07:56.949468 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.949396 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" event={"ID":"5b357a94-22ab-4fe2-aa0e-e269237d3b64","Type":"ContainerDied","Data":"04ac9357fd697bf267a5f13229dc40f8ab8aa5750c81c1ba4fdeda42c047cbcc"}
Apr 23 17:07:56.949468 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.949435 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5" event={"ID":"5b357a94-22ab-4fe2-aa0e-e269237d3b64","Type":"ContainerDied","Data":"c07510eb51fb2bfadfb049530a6ac79a811cccc254069743a4217e2d669e9723"}
Apr 23 17:07:56.949468 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.949455 2562 scope.go:117] "RemoveContainer" containerID="04ac9357fd697bf267a5f13229dc40f8ab8aa5750c81c1ba4fdeda42c047cbcc"
Apr 23 17:07:56.949837 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.949813 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 23 17:07:56.957108 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.957091 2562 scope.go:117] "RemoveContainer" containerID="430491bb5e9f0e1188448f3aa116cd388f3c2d0405660abbe12c947de004cf3c"
Apr 23 17:07:56.963884 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.963812 2562 scope.go:117] "RemoveContainer" containerID="04ac9357fd697bf267a5f13229dc40f8ab8aa5750c81c1ba4fdeda42c047cbcc"
Apr 23 17:07:56.964110 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:07:56.964085 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04ac9357fd697bf267a5f13229dc40f8ab8aa5750c81c1ba4fdeda42c047cbcc\": container with ID starting with 04ac9357fd697bf267a5f13229dc40f8ab8aa5750c81c1ba4fdeda42c047cbcc not found: ID does not exist" containerID="04ac9357fd697bf267a5f13229dc40f8ab8aa5750c81c1ba4fdeda42c047cbcc"
Apr 23 17:07:56.964186 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.964122 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ac9357fd697bf267a5f13229dc40f8ab8aa5750c81c1ba4fdeda42c047cbcc"} err="failed to get container status \"04ac9357fd697bf267a5f13229dc40f8ab8aa5750c81c1ba4fdeda42c047cbcc\": rpc error: code = NotFound desc = could not find container \"04ac9357fd697bf267a5f13229dc40f8ab8aa5750c81c1ba4fdeda42c047cbcc\": container with ID starting with 04ac9357fd697bf267a5f13229dc40f8ab8aa5750c81c1ba4fdeda42c047cbcc not found: ID does not exist"
Apr 23 17:07:56.964186 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.964146 2562 scope.go:117] "RemoveContainer" containerID="430491bb5e9f0e1188448f3aa116cd388f3c2d0405660abbe12c947de004cf3c"
Apr 23 17:07:56.964386 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:07:56.964369 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"430491bb5e9f0e1188448f3aa116cd388f3c2d0405660abbe12c947de004cf3c\": container with ID starting with 430491bb5e9f0e1188448f3aa116cd388f3c2d0405660abbe12c947de004cf3c not found: ID does not exist" containerID="430491bb5e9f0e1188448f3aa116cd388f3c2d0405660abbe12c947de004cf3c"
Apr 23 17:07:56.964437 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.964393 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"430491bb5e9f0e1188448f3aa116cd388f3c2d0405660abbe12c947de004cf3c"} err="failed to get container status \"430491bb5e9f0e1188448f3aa116cd388f3c2d0405660abbe12c947de004cf3c\": rpc error: code = NotFound desc = could not find container \"430491bb5e9f0e1188448f3aa116cd388f3c2d0405660abbe12c947de004cf3c\": container with ID starting with 430491bb5e9f0e1188448f3aa116cd388f3c2d0405660abbe12c947de004cf3c not found: ID does not exist"
Apr 23 17:07:56.967048 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.966991 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" podStartSLOduration=5.966980331 podStartE2EDuration="5.966980331s" podCreationTimestamp="2026-04-23 17:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:07:56.964228302 +0000 UTC m=+1956.817043366" watchObservedRunningTime="2026-04-23 17:07:56.966980331 +0000 UTC m=+1956.819795393"
Apr 23 17:07:56.977355 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.977328 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5"]
Apr 23 17:07:56.980515 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:56.980493 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-drgh5"]
Apr 23 17:07:57.953303 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:57.953262 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 23 17:07:58.741970 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:07:58.741940 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" path="/var/lib/kubelet/pods/5b357a94-22ab-4fe2-aa0e-e269237d3b64/volumes"
Apr 23 17:08:07.953653 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:08:07.953604 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 23 17:08:17.953821 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:08:17.953767 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 23 17:08:27.954201 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:08:27.954154 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 23 17:08:37.954256 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:08:37.954213 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 23 17:08:47.953959 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:08:47.953912 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 23 17:08:57.953804 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:08:57.953730 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 23 17:09:00.739921 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:00.739865 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused"
Apr 23 17:09:10.741778 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:10.741738 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb"
Apr 23 17:09:11.584512 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:11.584472 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb"]
Apr 23 17:09:11.584725 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:11.584704 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerName="kserve-container" containerID="cri-o://5242b0cfc3675311791d1e9944a52bf025a8bf6d28de30bcf78282929feff905" gracePeriod=30
Apr 23 17:09:11.654305 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:11.654273 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp"]
Apr 23 17:09:11.654598 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:11.654585 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" containerName="storage-initializer"
Apr 23 17:09:11.654651 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:11.654599 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" containerName="storage-initializer"
Apr 23 17:09:11.654651 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:11.654614 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" containerName="kserve-container"
Apr 23 17:09:11.654651 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:11.654620 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" containerName="kserve-container"
Apr 23 17:09:11.654767 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:11.654685 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b357a94-22ab-4fe2-aa0e-e269237d3b64" containerName="kserve-container"
Apr 23 17:09:11.657846 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:11.657830 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp"
Apr 23 17:09:11.667139 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:11.667113 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp"]
Apr 23 17:09:11.780249 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:11.780206 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abb36990-d551-49b0-ae53-571a3c00fd70-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-9q7dp\" (UID: \"abb36990-d551-49b0-ae53-571a3c00fd70\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp"
Apr 23 17:09:11.880943 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:11.880866 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abb36990-d551-49b0-ae53-571a3c00fd70-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-9q7dp\" (UID: \"abb36990-d551-49b0-ae53-571a3c00fd70\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp"
Apr 23 17:09:11.881236 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:11.881216 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abb36990-d551-49b0-ae53-571a3c00fd70-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-9q7dp\" (UID: \"abb36990-d551-49b0-ae53-571a3c00fd70\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp"
Apr 23 17:09:11.968975 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:11.968937 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp"
Apr 23 17:09:12.088577 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:12.088554 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp"]
Apr 23 17:09:12.090356 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:09:12.090323 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabb36990_d551_49b0_ae53_571a3c00fd70.slice/crio-41cc8c6afe08003677e4fd2fbd50a2906aff51be67c59aacd654c339604a8f44 WatchSource:0}: Error finding container 41cc8c6afe08003677e4fd2fbd50a2906aff51be67c59aacd654c339604a8f44: Status 404 returned error can't find the container with id 41cc8c6afe08003677e4fd2fbd50a2906aff51be67c59aacd654c339604a8f44
Apr 23 17:09:12.180883 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:12.180850 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" event={"ID":"abb36990-d551-49b0-ae53-571a3c00fd70","Type":"ContainerStarted","Data":"b2c3926714d3281e74ab1bad2b54409a054b66b748a6e2464e1b5b30a9d8c6bf"}
Apr 23 17:09:12.180883 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:12.180887 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" event={"ID":"abb36990-d551-49b0-ae53-571a3c00fd70","Type":"ContainerStarted","Data":"41cc8c6afe08003677e4fd2fbd50a2906aff51be67c59aacd654c339604a8f44"}
Apr 23 17:09:16.197922 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:16.197829 2562 generic.go:358] "Generic (PLEG): container finished" podID="abb36990-d551-49b0-ae53-571a3c00fd70" containerID="b2c3926714d3281e74ab1bad2b54409a054b66b748a6e2464e1b5b30a9d8c6bf" exitCode=0
Apr 23 17:09:16.197922 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:16.197888 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" event={"ID":"abb36990-d551-49b0-ae53-571a3c00fd70","Type":"ContainerDied","Data":"b2c3926714d3281e74ab1bad2b54409a054b66b748a6e2464e1b5b30a9d8c6bf"}
Apr 23 17:09:16.629396 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:16.629375 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb"
Apr 23 17:09:16.724564 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:16.724478 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e67e442c-1c6c-4df1-b237-5d1d04cfd134-kserve-provision-location\") pod \"e67e442c-1c6c-4df1-b237-5d1d04cfd134\" (UID: \"e67e442c-1c6c-4df1-b237-5d1d04cfd134\") "
Apr 23 17:09:16.724816 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:16.724792 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e67e442c-1c6c-4df1-b237-5d1d04cfd134-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e67e442c-1c6c-4df1-b237-5d1d04cfd134" (UID: "e67e442c-1c6c-4df1-b237-5d1d04cfd134"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:09:16.825670 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:16.825636 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e67e442c-1c6c-4df1-b237-5d1d04cfd134-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 17:09:17.203371 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.203335 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" event={"ID":"abb36990-d551-49b0-ae53-571a3c00fd70","Type":"ContainerStarted","Data":"2c52f7535c14a638b2016a093aee404fd1919034af6f43e16d2b89c0dc4b7fb8"}
Apr 23 17:09:17.203855 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.203638 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp"
Apr 23 17:09:17.204679 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.204653 2562 generic.go:358] "Generic (PLEG): container finished" podID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerID="5242b0cfc3675311791d1e9944a52bf025a8bf6d28de30bcf78282929feff905" exitCode=0
Apr 23 17:09:17.204807 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.204709 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" event={"ID":"e67e442c-1c6c-4df1-b237-5d1d04cfd134","Type":"ContainerDied","Data":"5242b0cfc3675311791d1e9944a52bf025a8bf6d28de30bcf78282929feff905"}
Apr 23 17:09:17.204807 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.204715 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb"
Apr 23 17:09:17.204807 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.204737 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb" event={"ID":"e67e442c-1c6c-4df1-b237-5d1d04cfd134","Type":"ContainerDied","Data":"df47605f661c53fd4f350140df9aa9aa0a809c727ac90fc540aecb692336d995"}
Apr 23 17:09:17.204807 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.204783 2562 scope.go:117] "RemoveContainer" containerID="5242b0cfc3675311791d1e9944a52bf025a8bf6d28de30bcf78282929feff905"
Apr 23 17:09:17.205024 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.204849 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 23 17:09:17.212494 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.212477 2562 scope.go:117] "RemoveContainer" containerID="eeea2c961f36f8621384c9f6b364f00cf030fe7bf3438ff2375c14858673b229"
Apr 23 17:09:17.219666 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.219650 2562 scope.go:117] "RemoveContainer" containerID="5242b0cfc3675311791d1e9944a52bf025a8bf6d28de30bcf78282929feff905"
Apr 23 17:09:17.219925 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:09:17.219906 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5242b0cfc3675311791d1e9944a52bf025a8bf6d28de30bcf78282929feff905\": container with ID starting with 5242b0cfc3675311791d1e9944a52bf025a8bf6d28de30bcf78282929feff905 not found: ID does not exist" containerID="5242b0cfc3675311791d1e9944a52bf025a8bf6d28de30bcf78282929feff905"
Apr 23 17:09:17.219988 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.219934 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5242b0cfc3675311791d1e9944a52bf025a8bf6d28de30bcf78282929feff905"} err="failed to get container status \"5242b0cfc3675311791d1e9944a52bf025a8bf6d28de30bcf78282929feff905\": rpc error: code = NotFound desc = could not find container \"5242b0cfc3675311791d1e9944a52bf025a8bf6d28de30bcf78282929feff905\": container with ID starting with 5242b0cfc3675311791d1e9944a52bf025a8bf6d28de30bcf78282929feff905 not found: ID does not exist"
Apr 23 17:09:17.219988 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.219953 2562 scope.go:117] "RemoveContainer" containerID="eeea2c961f36f8621384c9f6b364f00cf030fe7bf3438ff2375c14858673b229"
Apr 23 17:09:17.220204 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:09:17.220186 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeea2c961f36f8621384c9f6b364f00cf030fe7bf3438ff2375c14858673b229\": container with ID starting with eeea2c961f36f8621384c9f6b364f00cf030fe7bf3438ff2375c14858673b229 not found: ID does not exist" containerID="eeea2c961f36f8621384c9f6b364f00cf030fe7bf3438ff2375c14858673b229"
Apr 23 17:09:17.220258 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.220212 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeea2c961f36f8621384c9f6b364f00cf030fe7bf3438ff2375c14858673b229"} err="failed to get container status \"eeea2c961f36f8621384c9f6b364f00cf030fe7bf3438ff2375c14858673b229\": rpc error: code = NotFound desc = could not find container \"eeea2c961f36f8621384c9f6b364f00cf030fe7bf3438ff2375c14858673b229\": container with ID starting with eeea2c961f36f8621384c9f6b364f00cf030fe7bf3438ff2375c14858673b229 not found: ID does not exist"
Apr 23 17:09:17.224430 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.224363 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" podStartSLOduration=6.224348156 podStartE2EDuration="6.224348156s" podCreationTimestamp="2026-04-23 17:09:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:09:17.223413165 +0000 UTC m=+2037.076228229" watchObservedRunningTime="2026-04-23 17:09:17.224348156 +0000 UTC m=+2037.077163223"
Apr 23 17:09:17.237287 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.237261 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb"]
Apr 23 17:09:17.241838 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:17.241813 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-79qbb"]
Apr 23 17:09:18.208991 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:18.208955 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 23 17:09:18.742023 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:18.741992 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" path="/var/lib/kubelet/pods/e67e442c-1c6c-4df1-b237-5d1d04cfd134/volumes"
Apr 23 17:09:28.209080 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:28.209032 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 23 17:09:38.209287 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:38.209244 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 23 17:09:48.209383 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:48.209290 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 23 17:09:58.209088 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:09:58.209038 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 23 17:10:08.209624 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:08.209574 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 23 17:10:18.209900 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:18.209852 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 23 17:10:28.210098 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:28.210051 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 23 17:10:36.741676 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:36.741648 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp"
Apr 23 17:10:41.888354 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:41.888317 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp"]
Apr 23 17:10:41.888840 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:41.888631 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="kserve-container" containerID="cri-o://2c52f7535c14a638b2016a093aee404fd1919034af6f43e16d2b89c0dc4b7fb8" gracePeriod=30
Apr 23 17:10:41.978881 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:41.978845 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx"]
Apr 23 17:10:41.979174 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:41.979162 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerName="kserve-container"
Apr 23 17:10:41.979226 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:41.979176 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerName="kserve-container"
Apr 23 17:10:41.979226 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:41.979186 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerName="storage-initializer"
Apr 23 17:10:41.979226 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:41.979191 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerName="storage-initializer"
Apr 23 17:10:41.979323 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:41.979238 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="e67e442c-1c6c-4df1-b237-5d1d04cfd134" containerName="kserve-container"
Apr 23 17:10:41.982207 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:41.982192 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx"
Apr 23 17:10:41.988915 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:41.988887 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx"]
Apr 23 17:10:42.145530 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:42.145439 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b749c39b-a758-4398-a018-5fe3e3a620f9-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx\" (UID: \"b749c39b-a758-4398-a018-5fe3e3a620f9\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx"
Apr 23 17:10:42.246789 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:42.246714 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b749c39b-a758-4398-a018-5fe3e3a620f9-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx\" (UID: \"b749c39b-a758-4398-a018-5fe3e3a620f9\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx"
Apr 23 17:10:42.247164 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:42.247140 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b749c39b-a758-4398-a018-5fe3e3a620f9-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx\" (UID: \"b749c39b-a758-4398-a018-5fe3e3a620f9\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx"
Apr 23 17:10:42.292880 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:42.292844 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx"
Apr 23 17:10:42.411566 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:42.411533 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx"]
Apr 23 17:10:42.414018 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:10:42.413989 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb749c39b_a758_4398_a018_5fe3e3a620f9.slice/crio-5c4f65e12ce8ff1b2c6122c00b6704c3a4a0b04c46571221253478cea59ce2c1 WatchSource:0}: Error finding container 5c4f65e12ce8ff1b2c6122c00b6704c3a4a0b04c46571221253478cea59ce2c1: Status 404 returned error can't find the container with id 5c4f65e12ce8ff1b2c6122c00b6704c3a4a0b04c46571221253478cea59ce2c1
Apr 23 17:10:42.415803 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:42.415787 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:10:42.459803 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:42.459775 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" event={"ID":"b749c39b-a758-4398-a018-5fe3e3a620f9","Type":"ContainerStarted","Data":"5c4f65e12ce8ff1b2c6122c00b6704c3a4a0b04c46571221253478cea59ce2c1"}
Apr 23 17:10:43.463861 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:43.463827 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" event={"ID":"b749c39b-a758-4398-a018-5fe3e3a620f9","Type":"ContainerStarted","Data":"6ea3321a8a6b3f736b5726934b333325641f5341f62dcbe0a8e861ae87816a32"}
Apr 23 17:10:46.477080 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:46.476988 2562 generic.go:358] "Generic (PLEG): container finished" podID="b749c39b-a758-4398-a018-5fe3e3a620f9" containerID="6ea3321a8a6b3f736b5726934b333325641f5341f62dcbe0a8e861ae87816a32" exitCode=0
Apr 23 17:10:46.477080 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:46.477061 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" event={"ID":"b749c39b-a758-4398-a018-5fe3e3a620f9","Type":"ContainerDied","Data":"6ea3321a8a6b3f736b5726934b333325641f5341f62dcbe0a8e861ae87816a32"}
Apr 23 17:10:46.739009 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:46.738924 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused"
Apr 23 17:10:47.323762 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.323722 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp"
Apr 23 17:10:47.388156 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.388118 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abb36990-d551-49b0-ae53-571a3c00fd70-kserve-provision-location\") pod \"abb36990-d551-49b0-ae53-571a3c00fd70\" (UID: \"abb36990-d551-49b0-ae53-571a3c00fd70\") "
Apr 23 17:10:47.388546 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.388516 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/abb36990-d551-49b0-ae53-571a3c00fd70-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "abb36990-d551-49b0-ae53-571a3c00fd70" (UID: "abb36990-d551-49b0-ae53-571a3c00fd70"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:10:47.481927 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.481831 2562 generic.go:358] "Generic (PLEG): container finished" podID="abb36990-d551-49b0-ae53-571a3c00fd70" containerID="2c52f7535c14a638b2016a093aee404fd1919034af6f43e16d2b89c0dc4b7fb8" exitCode=0
Apr 23 17:10:47.481927 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.481909 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp"
Apr 23 17:10:47.482439 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.481919 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" event={"ID":"abb36990-d551-49b0-ae53-571a3c00fd70","Type":"ContainerDied","Data":"2c52f7535c14a638b2016a093aee404fd1919034af6f43e16d2b89c0dc4b7fb8"}
Apr 23 17:10:47.482439 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.481971 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp" event={"ID":"abb36990-d551-49b0-ae53-571a3c00fd70","Type":"ContainerDied","Data":"41cc8c6afe08003677e4fd2fbd50a2906aff51be67c59aacd654c339604a8f44"}
Apr 23 17:10:47.482439 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.481990 2562 scope.go:117] "RemoveContainer" containerID="2c52f7535c14a638b2016a093aee404fd1919034af6f43e16d2b89c0dc4b7fb8"
Apr 23 17:10:47.483682 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.483660 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" event={"ID":"b749c39b-a758-4398-a018-5fe3e3a620f9","Type":"ContainerStarted","Data":"90c5e007d457b0a09846dbb590f50706347dfab03fe3bd4e736a8092b5135e92"}
Apr 23 17:10:47.483909 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.483892 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness"
status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" Apr 23 17:10:47.488784 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.488758 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/abb36990-d551-49b0-ae53-571a3c00fd70-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:10:47.490292 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.490141 2562 scope.go:117] "RemoveContainer" containerID="b2c3926714d3281e74ab1bad2b54409a054b66b748a6e2464e1b5b30a9d8c6bf" Apr 23 17:10:47.497265 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.497243 2562 scope.go:117] "RemoveContainer" containerID="2c52f7535c14a638b2016a093aee404fd1919034af6f43e16d2b89c0dc4b7fb8" Apr 23 17:10:47.497520 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:10:47.497503 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c52f7535c14a638b2016a093aee404fd1919034af6f43e16d2b89c0dc4b7fb8\": container with ID starting with 2c52f7535c14a638b2016a093aee404fd1919034af6f43e16d2b89c0dc4b7fb8 not found: ID does not exist" containerID="2c52f7535c14a638b2016a093aee404fd1919034af6f43e16d2b89c0dc4b7fb8" Apr 23 17:10:47.497569 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.497530 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c52f7535c14a638b2016a093aee404fd1919034af6f43e16d2b89c0dc4b7fb8"} err="failed to get container status \"2c52f7535c14a638b2016a093aee404fd1919034af6f43e16d2b89c0dc4b7fb8\": rpc error: code = NotFound desc = could not find container \"2c52f7535c14a638b2016a093aee404fd1919034af6f43e16d2b89c0dc4b7fb8\": container with ID starting with 2c52f7535c14a638b2016a093aee404fd1919034af6f43e16d2b89c0dc4b7fb8 not found: ID does not exist" Apr 23 17:10:47.497569 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.497548 
2562 scope.go:117] "RemoveContainer" containerID="b2c3926714d3281e74ab1bad2b54409a054b66b748a6e2464e1b5b30a9d8c6bf" Apr 23 17:10:47.497814 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:10:47.497800 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2c3926714d3281e74ab1bad2b54409a054b66b748a6e2464e1b5b30a9d8c6bf\": container with ID starting with b2c3926714d3281e74ab1bad2b54409a054b66b748a6e2464e1b5b30a9d8c6bf not found: ID does not exist" containerID="b2c3926714d3281e74ab1bad2b54409a054b66b748a6e2464e1b5b30a9d8c6bf" Apr 23 17:10:47.497868 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.497831 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2c3926714d3281e74ab1bad2b54409a054b66b748a6e2464e1b5b30a9d8c6bf"} err="failed to get container status \"b2c3926714d3281e74ab1bad2b54409a054b66b748a6e2464e1b5b30a9d8c6bf\": rpc error: code = NotFound desc = could not find container \"b2c3926714d3281e74ab1bad2b54409a054b66b748a6e2464e1b5b30a9d8c6bf\": container with ID starting with b2c3926714d3281e74ab1bad2b54409a054b66b748a6e2464e1b5b30a9d8c6bf not found: ID does not exist" Apr 23 17:10:47.501491 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.501451 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" podStartSLOduration=6.501439952 podStartE2EDuration="6.501439952s" podCreationTimestamp="2026-04-23 17:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:10:47.500256194 +0000 UTC m=+2127.353071257" watchObservedRunningTime="2026-04-23 17:10:47.501439952 +0000 UTC m=+2127.354255012" Apr 23 17:10:47.512287 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.512262 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp"] Apr 23 17:10:47.514151 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:47.514128 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-9q7dp"] Apr 23 17:10:48.742587 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:10:48.742547 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" path="/var/lib/kubelet/pods/abb36990-d551-49b0-ae53-571a3c00fd70/volumes" Apr 23 17:11:18.490052 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:11:18.490010 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" podUID="b749c39b-a758-4398-a018-5fe3e3a620f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.45:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 17:11:28.488305 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:11:28.488249 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" podUID="b749c39b-a758-4398-a018-5fe3e3a620f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.45:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 17:11:38.488537 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:11:38.488486 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" podUID="b749c39b-a758-4398-a018-5fe3e3a620f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.45:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 17:11:48.488332 ip-10-0-135-57 kubenswrapper[2562]: 
I0423 17:11:48.488284 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" podUID="b749c39b-a758-4398-a018-5fe3e3a620f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.45:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 17:11:52.738546 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:11:52.738498 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" podUID="b749c39b-a758-4398-a018-5fe3e3a620f9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.45:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 17:12:02.742321 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:02.742292 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" Apr 23 17:12:12.076703 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.076672 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx"] Apr 23 17:12:12.077112 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.076978 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" podUID="b749c39b-a758-4398-a018-5fe3e3a620f9" containerName="kserve-container" containerID="cri-o://90c5e007d457b0a09846dbb590f50706347dfab03fe3bd4e736a8092b5135e92" gracePeriod=30 Apr 23 17:12:12.190168 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.190134 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674"] Apr 23 17:12:12.190506 ip-10-0-135-57 kubenswrapper[2562]: 
I0423 17:12:12.190492 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="storage-initializer" Apr 23 17:12:12.190554 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.190508 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="storage-initializer" Apr 23 17:12:12.190554 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.190543 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="kserve-container" Apr 23 17:12:12.190554 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.190550 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="kserve-container" Apr 23 17:12:12.190647 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.190607 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="abb36990-d551-49b0-ae53-571a3c00fd70" containerName="kserve-container" Apr 23 17:12:12.193683 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.193668 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" Apr 23 17:12:12.202152 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.202127 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674"] Apr 23 17:12:12.318641 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.318606 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3bfcd2c-f114-46f9-9827-b6edff1f340b-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674\" (UID: \"f3bfcd2c-f114-46f9-9827-b6edff1f340b\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" Apr 23 17:12:12.419497 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.419456 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3bfcd2c-f114-46f9-9827-b6edff1f340b-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674\" (UID: \"f3bfcd2c-f114-46f9-9827-b6edff1f340b\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" Apr 23 17:12:12.419881 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.419862 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3bfcd2c-f114-46f9-9827-b6edff1f340b-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674\" (UID: \"f3bfcd2c-f114-46f9-9827-b6edff1f340b\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" Apr 23 17:12:12.504593 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.504562 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" Apr 23 17:12:12.629086 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.629055 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674"] Apr 23 17:12:12.631916 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:12:12.631884 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3bfcd2c_f114_46f9_9827_b6edff1f340b.slice/crio-c72f0589815490bea167c03b63077b12dfbd8968fdbc9ff52735a0b886e7e86b WatchSource:0}: Error finding container c72f0589815490bea167c03b63077b12dfbd8968fdbc9ff52735a0b886e7e86b: Status 404 returned error can't find the container with id c72f0589815490bea167c03b63077b12dfbd8968fdbc9ff52735a0b886e7e86b Apr 23 17:12:12.733372 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.733339 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" event={"ID":"f3bfcd2c-f114-46f9-9827-b6edff1f340b","Type":"ContainerStarted","Data":"805193cbf040b4420235733ff133dacfcb01c0b7cf861d28daad5694e70ef4df"} Apr 23 17:12:12.733372 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.733376 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" event={"ID":"f3bfcd2c-f114-46f9-9827-b6edff1f340b","Type":"ContainerStarted","Data":"c72f0589815490bea167c03b63077b12dfbd8968fdbc9ff52735a0b886e7e86b"} Apr 23 17:12:12.739120 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:12.739079 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" podUID="b749c39b-a758-4398-a018-5fe3e3a620f9" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.133.0.45:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.45:8080: connect: connection refused" Apr 23 17:12:16.749820 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:16.749787 2562 generic.go:358] "Generic (PLEG): container finished" podID="f3bfcd2c-f114-46f9-9827-b6edff1f340b" containerID="805193cbf040b4420235733ff133dacfcb01c0b7cf861d28daad5694e70ef4df" exitCode=0 Apr 23 17:12:16.750247 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:16.749868 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" event={"ID":"f3bfcd2c-f114-46f9-9827-b6edff1f340b","Type":"ContainerDied","Data":"805193cbf040b4420235733ff133dacfcb01c0b7cf861d28daad5694e70ef4df"} Apr 23 17:12:17.121065 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.121043 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" Apr 23 17:12:17.259302 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.259268 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b749c39b-a758-4398-a018-5fe3e3a620f9-kserve-provision-location\") pod \"b749c39b-a758-4398-a018-5fe3e3a620f9\" (UID: \"b749c39b-a758-4398-a018-5fe3e3a620f9\") " Apr 23 17:12:17.259693 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.259664 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b749c39b-a758-4398-a018-5fe3e3a620f9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b749c39b-a758-4398-a018-5fe3e3a620f9" (UID: "b749c39b-a758-4398-a018-5fe3e3a620f9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:12:17.360099 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.359995 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b749c39b-a758-4398-a018-5fe3e3a620f9-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:12:17.753992 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.753957 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" event={"ID":"f3bfcd2c-f114-46f9-9827-b6edff1f340b","Type":"ContainerStarted","Data":"255af23d0d594ee2c3df4e8bf6b23008217cca238d5e11a12c399eca8ac240b4"} Apr 23 17:12:17.754492 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.754180 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" Apr 23 17:12:17.755318 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.755289 2562 generic.go:358] "Generic (PLEG): container finished" podID="b749c39b-a758-4398-a018-5fe3e3a620f9" containerID="90c5e007d457b0a09846dbb590f50706347dfab03fe3bd4e736a8092b5135e92" exitCode=0 Apr 23 17:12:17.755388 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.755341 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" Apr 23 17:12:17.755388 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.755350 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" event={"ID":"b749c39b-a758-4398-a018-5fe3e3a620f9","Type":"ContainerDied","Data":"90c5e007d457b0a09846dbb590f50706347dfab03fe3bd4e736a8092b5135e92"} Apr 23 17:12:17.755388 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.755377 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx" event={"ID":"b749c39b-a758-4398-a018-5fe3e3a620f9","Type":"ContainerDied","Data":"5c4f65e12ce8ff1b2c6122c00b6704c3a4a0b04c46571221253478cea59ce2c1"} Apr 23 17:12:17.755492 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.755394 2562 scope.go:117] "RemoveContainer" containerID="90c5e007d457b0a09846dbb590f50706347dfab03fe3bd4e736a8092b5135e92" Apr 23 17:12:17.763118 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.762961 2562 scope.go:117] "RemoveContainer" containerID="6ea3321a8a6b3f736b5726934b333325641f5341f62dcbe0a8e861ae87816a32" Apr 23 17:12:17.770343 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.770322 2562 scope.go:117] "RemoveContainer" containerID="90c5e007d457b0a09846dbb590f50706347dfab03fe3bd4e736a8092b5135e92" Apr 23 17:12:17.770631 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:12:17.770606 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90c5e007d457b0a09846dbb590f50706347dfab03fe3bd4e736a8092b5135e92\": container with ID starting with 90c5e007d457b0a09846dbb590f50706347dfab03fe3bd4e736a8092b5135e92 not found: ID does not exist" containerID="90c5e007d457b0a09846dbb590f50706347dfab03fe3bd4e736a8092b5135e92" Apr 23 17:12:17.770737 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.770640 2562 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90c5e007d457b0a09846dbb590f50706347dfab03fe3bd4e736a8092b5135e92"} err="failed to get container status \"90c5e007d457b0a09846dbb590f50706347dfab03fe3bd4e736a8092b5135e92\": rpc error: code = NotFound desc = could not find container \"90c5e007d457b0a09846dbb590f50706347dfab03fe3bd4e736a8092b5135e92\": container with ID starting with 90c5e007d457b0a09846dbb590f50706347dfab03fe3bd4e736a8092b5135e92 not found: ID does not exist" Apr 23 17:12:17.770737 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.770659 2562 scope.go:117] "RemoveContainer" containerID="6ea3321a8a6b3f736b5726934b333325641f5341f62dcbe0a8e861ae87816a32" Apr 23 17:12:17.770945 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:12:17.770921 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea3321a8a6b3f736b5726934b333325641f5341f62dcbe0a8e861ae87816a32\": container with ID starting with 6ea3321a8a6b3f736b5726934b333325641f5341f62dcbe0a8e861ae87816a32 not found: ID does not exist" containerID="6ea3321a8a6b3f736b5726934b333325641f5341f62dcbe0a8e861ae87816a32" Apr 23 17:12:17.770983 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.770954 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea3321a8a6b3f736b5726934b333325641f5341f62dcbe0a8e861ae87816a32"} err="failed to get container status \"6ea3321a8a6b3f736b5726934b333325641f5341f62dcbe0a8e861ae87816a32\": rpc error: code = NotFound desc = could not find container \"6ea3321a8a6b3f736b5726934b333325641f5341f62dcbe0a8e861ae87816a32\": container with ID starting with 6ea3321a8a6b3f736b5726934b333325641f5341f62dcbe0a8e861ae87816a32 not found: ID does not exist" Apr 23 17:12:17.771603 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.771571 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" podStartSLOduration=5.77156163 podStartE2EDuration="5.77156163s" podCreationTimestamp="2026-04-23 17:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:12:17.769366046 +0000 UTC m=+2217.622181112" watchObservedRunningTime="2026-04-23 17:12:17.77156163 +0000 UTC m=+2217.624376771" Apr 23 17:12:17.781266 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.781245 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx"] Apr 23 17:12:17.784654 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:17.784632 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-scrqx"] Apr 23 17:12:18.742426 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:18.742385 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b749c39b-a758-4398-a018-5fe3e3a620f9" path="/var/lib/kubelet/pods/b749c39b-a758-4398-a018-5fe3e3a620f9/volumes" Apr 23 17:12:48.761592 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:48.761556 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" podUID="f3bfcd2c-f114-46f9-9827-b6edff1f340b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.46:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 17:12:58.760472 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:12:58.760430 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" podUID="f3bfcd2c-f114-46f9-9827-b6edff1f340b" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.133.0.46:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 17:13:08.760316 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:08.760271 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" podUID="f3bfcd2c-f114-46f9-9827-b6edff1f340b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.46:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 17:13:18.760758 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:18.760697 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" podUID="f3bfcd2c-f114-46f9-9827-b6edff1f340b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.46:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 17:13:28.760684 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:28.760637 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" podUID="f3bfcd2c-f114-46f9-9827-b6edff1f340b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.46:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.46:8080: connect: connection refused" Apr 23 17:13:38.764802 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:38.764771 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" Apr 23 17:13:42.315272 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.315228 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674"] Apr 23 17:13:42.315779 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:13:42.315560 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" podUID="f3bfcd2c-f114-46f9-9827-b6edff1f340b" containerName="kserve-container" containerID="cri-o://255af23d0d594ee2c3df4e8bf6b23008217cca238d5e11a12c399eca8ac240b4" gracePeriod=30 Apr 23 17:13:42.400306 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.400272 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz"] Apr 23 17:13:42.400599 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.400587 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b749c39b-a758-4398-a018-5fe3e3a620f9" containerName="storage-initializer" Apr 23 17:13:42.400644 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.400600 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="b749c39b-a758-4398-a018-5fe3e3a620f9" containerName="storage-initializer" Apr 23 17:13:42.400644 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.400613 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b749c39b-a758-4398-a018-5fe3e3a620f9" containerName="kserve-container" Apr 23 17:13:42.400644 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.400619 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="b749c39b-a758-4398-a018-5fe3e3a620f9" containerName="kserve-container" Apr 23 17:13:42.400757 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.400675 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="b749c39b-a758-4398-a018-5fe3e3a620f9" containerName="kserve-container" Apr 23 17:13:42.403618 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.403600 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz"
Apr 23 17:13:42.411572 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.411548 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz"]
Apr 23 17:13:42.488031 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.487995 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ff4a9c6-0867-4c59-800e-f0f1a0177183-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz\" (UID: \"4ff4a9c6-0867-4c59-800e-f0f1a0177183\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz"
Apr 23 17:13:42.588618 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.588532 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ff4a9c6-0867-4c59-800e-f0f1a0177183-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz\" (UID: \"4ff4a9c6-0867-4c59-800e-f0f1a0177183\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz"
Apr 23 17:13:42.588942 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.588922 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ff4a9c6-0867-4c59-800e-f0f1a0177183-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz\" (UID: \"4ff4a9c6-0867-4c59-800e-f0f1a0177183\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz"
Apr 23 17:13:42.714527 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.714490 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz"
Apr 23 17:13:42.841546 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.841366 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz"]
Apr 23 17:13:42.845318 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:13:42.845287 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ff4a9c6_0867_4c59_800e_f0f1a0177183.slice/crio-658155f42e61af4ee1e9cb9ce9c1050945a77bc8b7946df16887348177f3d4ea WatchSource:0}: Error finding container 658155f42e61af4ee1e9cb9ce9c1050945a77bc8b7946df16887348177f3d4ea: Status 404 returned error can't find the container with id 658155f42e61af4ee1e9cb9ce9c1050945a77bc8b7946df16887348177f3d4ea
Apr 23 17:13:42.997613 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.997565 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz" event={"ID":"4ff4a9c6-0867-4c59-800e-f0f1a0177183","Type":"ContainerStarted","Data":"0db9660f8f8d392383c37bea4a4aff6a642fd7b75e4290b878a5d4f7c86a92b1"}
Apr 23 17:13:42.997836 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:42.997623 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz" event={"ID":"4ff4a9c6-0867-4c59-800e-f0f1a0177183","Type":"ContainerStarted","Data":"658155f42e61af4ee1e9cb9ce9c1050945a77bc8b7946df16887348177f3d4ea"}
Apr 23 17:13:47.010512 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:47.010413 2562 generic.go:358] "Generic (PLEG): container finished" podID="4ff4a9c6-0867-4c59-800e-f0f1a0177183" containerID="0db9660f8f8d392383c37bea4a4aff6a642fd7b75e4290b878a5d4f7c86a92b1" exitCode=0
Apr 23 17:13:47.010512 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:47.010488 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz" event={"ID":"4ff4a9c6-0867-4c59-800e-f0f1a0177183","Type":"ContainerDied","Data":"0db9660f8f8d392383c37bea4a4aff6a642fd7b75e4290b878a5d4f7c86a92b1"}
Apr 23 17:13:47.251696 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:47.251671 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674"
Apr 23 17:13:47.328497 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:47.328405 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3bfcd2c-f114-46f9-9827-b6edff1f340b-kserve-provision-location\") pod \"f3bfcd2c-f114-46f9-9827-b6edff1f340b\" (UID: \"f3bfcd2c-f114-46f9-9827-b6edff1f340b\") "
Apr 23 17:13:47.328734 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:47.328708 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3bfcd2c-f114-46f9-9827-b6edff1f340b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f3bfcd2c-f114-46f9-9827-b6edff1f340b" (UID: "f3bfcd2c-f114-46f9-9827-b6edff1f340b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:13:47.429823 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:47.429772 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f3bfcd2c-f114-46f9-9827-b6edff1f340b-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 17:13:48.014865 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.014832 2562 generic.go:358] "Generic (PLEG): container finished" podID="f3bfcd2c-f114-46f9-9827-b6edff1f340b" containerID="255af23d0d594ee2c3df4e8bf6b23008217cca238d5e11a12c399eca8ac240b4" exitCode=0
Apr 23 17:13:48.015295 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.014897 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674"
Apr 23 17:13:48.015295 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.014927 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" event={"ID":"f3bfcd2c-f114-46f9-9827-b6edff1f340b","Type":"ContainerDied","Data":"255af23d0d594ee2c3df4e8bf6b23008217cca238d5e11a12c399eca8ac240b4"}
Apr 23 17:13:48.015295 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.014973 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674" event={"ID":"f3bfcd2c-f114-46f9-9827-b6edff1f340b","Type":"ContainerDied","Data":"c72f0589815490bea167c03b63077b12dfbd8968fdbc9ff52735a0b886e7e86b"}
Apr 23 17:13:48.015295 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.014996 2562 scope.go:117] "RemoveContainer" containerID="255af23d0d594ee2c3df4e8bf6b23008217cca238d5e11a12c399eca8ac240b4"
Apr 23 17:13:48.016512 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.016482 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz" event={"ID":"4ff4a9c6-0867-4c59-800e-f0f1a0177183","Type":"ContainerStarted","Data":"eb8968aea9385e1dde655c54c8ea7b9e9dd1d36626471746583622ba42e161be"}
Apr 23 17:13:48.016681 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.016664 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz"
Apr 23 17:13:48.022904 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.022879 2562 scope.go:117] "RemoveContainer" containerID="805193cbf040b4420235733ff133dacfcb01c0b7cf861d28daad5694e70ef4df"
Apr 23 17:13:48.030227 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.030208 2562 scope.go:117] "RemoveContainer" containerID="255af23d0d594ee2c3df4e8bf6b23008217cca238d5e11a12c399eca8ac240b4"
Apr 23 17:13:48.030457 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:13:48.030439 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"255af23d0d594ee2c3df4e8bf6b23008217cca238d5e11a12c399eca8ac240b4\": container with ID starting with 255af23d0d594ee2c3df4e8bf6b23008217cca238d5e11a12c399eca8ac240b4 not found: ID does not exist" containerID="255af23d0d594ee2c3df4e8bf6b23008217cca238d5e11a12c399eca8ac240b4"
Apr 23 17:13:48.030507 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.030467 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"255af23d0d594ee2c3df4e8bf6b23008217cca238d5e11a12c399eca8ac240b4"} err="failed to get container status \"255af23d0d594ee2c3df4e8bf6b23008217cca238d5e11a12c399eca8ac240b4\": rpc error: code = NotFound desc = could not find container \"255af23d0d594ee2c3df4e8bf6b23008217cca238d5e11a12c399eca8ac240b4\": container with ID starting with 255af23d0d594ee2c3df4e8bf6b23008217cca238d5e11a12c399eca8ac240b4 not found: ID does not exist"
Apr 23 17:13:48.030507 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.030483 2562 scope.go:117] "RemoveContainer" containerID="805193cbf040b4420235733ff133dacfcb01c0b7cf861d28daad5694e70ef4df"
Apr 23 17:13:48.030732 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:13:48.030708 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"805193cbf040b4420235733ff133dacfcb01c0b7cf861d28daad5694e70ef4df\": container with ID starting with 805193cbf040b4420235733ff133dacfcb01c0b7cf861d28daad5694e70ef4df not found: ID does not exist" containerID="805193cbf040b4420235733ff133dacfcb01c0b7cf861d28daad5694e70ef4df"
Apr 23 17:13:48.030818 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.030755 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"805193cbf040b4420235733ff133dacfcb01c0b7cf861d28daad5694e70ef4df"} err="failed to get container status \"805193cbf040b4420235733ff133dacfcb01c0b7cf861d28daad5694e70ef4df\": rpc error: code = NotFound desc = could not find container \"805193cbf040b4420235733ff133dacfcb01c0b7cf861d28daad5694e70ef4df\": container with ID starting with 805193cbf040b4420235733ff133dacfcb01c0b7cf861d28daad5694e70ef4df not found: ID does not exist"
Apr 23 17:13:48.034953 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.034915 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz" podStartSLOduration=6.034903882 podStartE2EDuration="6.034903882s" podCreationTimestamp="2026-04-23 17:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:13:48.033287083 +0000 UTC m=+2307.886102146" watchObservedRunningTime="2026-04-23 17:13:48.034903882 +0000 UTC m=+2307.887718945"
Apr 23 17:13:48.045187 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.045159 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674"]
Apr 23 17:13:48.047472 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.047445 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-9m674"]
Apr 23 17:13:48.742450 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:13:48.742415 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3bfcd2c-f114-46f9-9827-b6edff1f340b" path="/var/lib/kubelet/pods/f3bfcd2c-f114-46f9-9827-b6edff1f340b/volumes"
Apr 23 17:14:19.022412 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:14:19.022358 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz" podUID="4ff4a9c6-0867-4c59-800e-f0f1a0177183" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.47:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.47:8080: connect: connection refused"
Apr 23 17:14:29.021084 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:14:29.021041 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz" podUID="4ff4a9c6-0867-4c59-800e-f0f1a0177183" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.47:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.47:8080: connect: connection refused"
Apr 23 17:14:39.021037 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:14:39.020993 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz" podUID="4ff4a9c6-0867-4c59-800e-f0f1a0177183" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.47:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.47:8080: connect: connection refused"
Apr 23 17:14:49.020589 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:14:49.020543 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz" podUID="4ff4a9c6-0867-4c59-800e-f0f1a0177183" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.47:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.47:8080: connect: connection refused"
Apr 23 17:14:52.738402 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:14:52.738351 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz" podUID="4ff4a9c6-0867-4c59-800e-f0f1a0177183" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.47:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.47:8080: connect: connection refused"
Apr 23 17:15:02.742866 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:02.742832 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz"
Apr 23 17:15:12.519419 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:12.519387 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz"]
Apr 23 17:15:12.519850 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:12.519642 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz" podUID="4ff4a9c6-0867-4c59-800e-f0f1a0177183" containerName="kserve-container" containerID="cri-o://eb8968aea9385e1dde655c54c8ea7b9e9dd1d36626471746583622ba42e161be" gracePeriod=30
Apr 23 17:15:12.739013 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:12.738969 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz" podUID="4ff4a9c6-0867-4c59-800e-f0f1a0177183" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.47:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.47:8080: connect: connection refused"
Apr 23 17:15:14.679884 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:14.679828 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55"]
Apr 23 17:15:14.680275 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:14.680169 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3bfcd2c-f114-46f9-9827-b6edff1f340b" containerName="kserve-container"
Apr 23 17:15:14.680275 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:14.680180 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3bfcd2c-f114-46f9-9827-b6edff1f340b" containerName="kserve-container"
Apr 23 17:15:14.680275 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:14.680193 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3bfcd2c-f114-46f9-9827-b6edff1f340b" containerName="storage-initializer"
Apr 23 17:15:14.680275 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:14.680199 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3bfcd2c-f114-46f9-9827-b6edff1f340b" containerName="storage-initializer"
Apr 23 17:15:14.680275 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:14.680257 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3bfcd2c-f114-46f9-9827-b6edff1f340b" containerName="kserve-container"
Apr 23 17:15:14.683245 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:14.683229 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55"
Apr 23 17:15:14.693516 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:14.693490 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55"]
Apr 23 17:15:14.779620 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:14.779581 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41bebb8a-5ad6-4341-a8f8-4735ee6f3a49-kserve-provision-location\") pod \"isvc-sklearn-predictor-7cdb7c9b74-lzt55\" (UID: \"41bebb8a-5ad6-4341-a8f8-4735ee6f3a49\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55"
Apr 23 17:15:14.880216 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:14.880170 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41bebb8a-5ad6-4341-a8f8-4735ee6f3a49-kserve-provision-location\") pod \"isvc-sklearn-predictor-7cdb7c9b74-lzt55\" (UID: \"41bebb8a-5ad6-4341-a8f8-4735ee6f3a49\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55"
Apr 23 17:15:14.880549 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:14.880529 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41bebb8a-5ad6-4341-a8f8-4735ee6f3a49-kserve-provision-location\") pod \"isvc-sklearn-predictor-7cdb7c9b74-lzt55\" (UID: \"41bebb8a-5ad6-4341-a8f8-4735ee6f3a49\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55"
Apr 23 17:15:14.993515 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:14.993424 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55"
Apr 23 17:15:15.113459 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:15.113426 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55"]
Apr 23 17:15:15.116416 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:15:15.116383 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41bebb8a_5ad6_4341_a8f8_4735ee6f3a49.slice/crio-968ec26ffd0bf6d0d65e4fd2026878ab5898083fdf962a2c9dfc2852a3395df6 WatchSource:0}: Error finding container 968ec26ffd0bf6d0d65e4fd2026878ab5898083fdf962a2c9dfc2852a3395df6: Status 404 returned error can't find the container with id 968ec26ffd0bf6d0d65e4fd2026878ab5898083fdf962a2c9dfc2852a3395df6
Apr 23 17:15:15.265423 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:15.265335 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" event={"ID":"41bebb8a-5ad6-4341-a8f8-4735ee6f3a49","Type":"ContainerStarted","Data":"450d229e340fe5fb9a4e69bb0d544efd11cd49d9fc9c7a793c8266c6b31be37c"}
Apr 23 17:15:15.265423 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:15.265378 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" event={"ID":"41bebb8a-5ad6-4341-a8f8-4735ee6f3a49","Type":"ContainerStarted","Data":"968ec26ffd0bf6d0d65e4fd2026878ab5898083fdf962a2c9dfc2852a3395df6"}
Apr 23 17:15:17.953527 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:17.953503 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz"
Apr 23 17:15:18.011101 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.011062 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ff4a9c6-0867-4c59-800e-f0f1a0177183-kserve-provision-location\") pod \"4ff4a9c6-0867-4c59-800e-f0f1a0177183\" (UID: \"4ff4a9c6-0867-4c59-800e-f0f1a0177183\") "
Apr 23 17:15:18.011415 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.011389 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ff4a9c6-0867-4c59-800e-f0f1a0177183-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4ff4a9c6-0867-4c59-800e-f0f1a0177183" (UID: "4ff4a9c6-0867-4c59-800e-f0f1a0177183"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 17:15:18.111861 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.111782 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ff4a9c6-0867-4c59-800e-f0f1a0177183-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\""
Apr 23 17:15:18.276269 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.276236 2562 generic.go:358] "Generic (PLEG): container finished" podID="4ff4a9c6-0867-4c59-800e-f0f1a0177183" containerID="eb8968aea9385e1dde655c54c8ea7b9e9dd1d36626471746583622ba42e161be" exitCode=0
Apr 23 17:15:18.276429 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.276308 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz"
Apr 23 17:15:18.276429 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.276339 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz" event={"ID":"4ff4a9c6-0867-4c59-800e-f0f1a0177183","Type":"ContainerDied","Data":"eb8968aea9385e1dde655c54c8ea7b9e9dd1d36626471746583622ba42e161be"}
Apr 23 17:15:18.276429 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.276393 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz" event={"ID":"4ff4a9c6-0867-4c59-800e-f0f1a0177183","Type":"ContainerDied","Data":"658155f42e61af4ee1e9cb9ce9c1050945a77bc8b7946df16887348177f3d4ea"}
Apr 23 17:15:18.276429 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.276415 2562 scope.go:117] "RemoveContainer" containerID="eb8968aea9385e1dde655c54c8ea7b9e9dd1d36626471746583622ba42e161be"
Apr 23 17:15:18.285399 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.285382 2562 scope.go:117] "RemoveContainer" containerID="0db9660f8f8d392383c37bea4a4aff6a642fd7b75e4290b878a5d4f7c86a92b1"
Apr 23 17:15:18.292676 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.292662 2562 scope.go:117] "RemoveContainer" containerID="eb8968aea9385e1dde655c54c8ea7b9e9dd1d36626471746583622ba42e161be"
Apr 23 17:15:18.292926 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:15:18.292908 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb8968aea9385e1dde655c54c8ea7b9e9dd1d36626471746583622ba42e161be\": container with ID starting with eb8968aea9385e1dde655c54c8ea7b9e9dd1d36626471746583622ba42e161be not found: ID does not exist" containerID="eb8968aea9385e1dde655c54c8ea7b9e9dd1d36626471746583622ba42e161be"
Apr 23 17:15:18.292980 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.292936 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb8968aea9385e1dde655c54c8ea7b9e9dd1d36626471746583622ba42e161be"} err="failed to get container status \"eb8968aea9385e1dde655c54c8ea7b9e9dd1d36626471746583622ba42e161be\": rpc error: code = NotFound desc = could not find container \"eb8968aea9385e1dde655c54c8ea7b9e9dd1d36626471746583622ba42e161be\": container with ID starting with eb8968aea9385e1dde655c54c8ea7b9e9dd1d36626471746583622ba42e161be not found: ID does not exist"
Apr 23 17:15:18.292980 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.292955 2562 scope.go:117] "RemoveContainer" containerID="0db9660f8f8d392383c37bea4a4aff6a642fd7b75e4290b878a5d4f7c86a92b1"
Apr 23 17:15:18.293148 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:15:18.293136 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0db9660f8f8d392383c37bea4a4aff6a642fd7b75e4290b878a5d4f7c86a92b1\": container with ID starting with 0db9660f8f8d392383c37bea4a4aff6a642fd7b75e4290b878a5d4f7c86a92b1 not found: ID does not exist" containerID="0db9660f8f8d392383c37bea4a4aff6a642fd7b75e4290b878a5d4f7c86a92b1"
Apr 23 17:15:18.293188 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.293150 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0db9660f8f8d392383c37bea4a4aff6a642fd7b75e4290b878a5d4f7c86a92b1"} err="failed to get container status \"0db9660f8f8d392383c37bea4a4aff6a642fd7b75e4290b878a5d4f7c86a92b1\": rpc error: code = NotFound desc = could not find container \"0db9660f8f8d392383c37bea4a4aff6a642fd7b75e4290b878a5d4f7c86a92b1\": container with ID starting with 0db9660f8f8d392383c37bea4a4aff6a642fd7b75e4290b878a5d4f7c86a92b1 not found: ID does not exist"
Apr 23 17:15:18.297806 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.297761 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz"]
Apr 23 17:15:18.299614 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.299593 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-r4qzz"]
Apr 23 17:15:18.741558 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:18.741523 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ff4a9c6-0867-4c59-800e-f0f1a0177183" path="/var/lib/kubelet/pods/4ff4a9c6-0867-4c59-800e-f0f1a0177183/volumes"
Apr 23 17:15:19.280009 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:19.279918 2562 generic.go:358] "Generic (PLEG): container finished" podID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerID="450d229e340fe5fb9a4e69bb0d544efd11cd49d9fc9c7a793c8266c6b31be37c" exitCode=0
Apr 23 17:15:19.280439 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:19.279996 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" event={"ID":"41bebb8a-5ad6-4341-a8f8-4735ee6f3a49","Type":"ContainerDied","Data":"450d229e340fe5fb9a4e69bb0d544efd11cd49d9fc9c7a793c8266c6b31be37c"}
Apr 23 17:15:20.285595 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:20.285561 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" event={"ID":"41bebb8a-5ad6-4341-a8f8-4735ee6f3a49","Type":"ContainerStarted","Data":"92c13990d8defe4b5ec7a970a2012d0d7c0ecb8405f921b9fab4b8107169d985"}
Apr 23 17:15:20.286036 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:20.285876 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55"
Apr 23 17:15:20.287329 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:20.287266 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" podUID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 23 17:15:20.303410 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:20.303364 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" podStartSLOduration=6.303351058 podStartE2EDuration="6.303351058s" podCreationTimestamp="2026-04-23 17:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:15:20.302555522 +0000 UTC m=+2400.155370603" watchObservedRunningTime="2026-04-23 17:15:20.303351058 +0000 UTC m=+2400.156166122"
Apr 23 17:15:21.288428 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:21.288390 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" podUID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 23 17:15:31.288760 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:31.288690 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" podUID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 23 17:15:41.288602 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:41.288514 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" podUID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 23 17:15:51.289102 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:15:51.289059 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" podUID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 23 17:16:01.288786 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:01.288719 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" podUID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 23 17:16:11.288411 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:11.288353 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" podUID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 23 17:16:21.288623 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:21.288579 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" podUID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 23 17:16:31.290468 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:31.290433 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55"
Apr 23 17:16:34.804589 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:34.804559 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55"]
Apr 23 17:16:34.805050 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:34.804828 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" podUID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerName="kserve-container" containerID="cri-o://92c13990d8defe4b5ec7a970a2012d0d7c0ecb8405f921b9fab4b8107169d985" gracePeriod=30
Apr 23 17:16:34.876650 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:34.876611 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95"]
Apr 23 17:16:34.876971 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:34.876958 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ff4a9c6-0867-4c59-800e-f0f1a0177183" containerName="kserve-container"
Apr 23 17:16:34.877026 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:34.876973 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff4a9c6-0867-4c59-800e-f0f1a0177183" containerName="kserve-container"
Apr 23 17:16:34.877026 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:34.876984 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ff4a9c6-0867-4c59-800e-f0f1a0177183" containerName="storage-initializer"
Apr 23 17:16:34.877026 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:34.876989 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff4a9c6-0867-4c59-800e-f0f1a0177183" containerName="storage-initializer"
Apr 23 17:16:34.877119 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:34.877042 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ff4a9c6-0867-4c59-800e-f0f1a0177183" containerName="kserve-container"
Apr 23 17:16:34.880054 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:34.880034 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95"
Apr 23 17:16:34.889176 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:34.889149 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95"]
Apr 23 17:16:35.051013 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:35.050968 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb5847d6-03ec-4f95-b52e-73ad90bd6deb-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-xnt95\" (UID: \"fb5847d6-03ec-4f95-b52e-73ad90bd6deb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95"
Apr 23 17:16:35.152471 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:35.152437 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb5847d6-03ec-4f95-b52e-73ad90bd6deb-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-xnt95\" (UID: \"fb5847d6-03ec-4f95-b52e-73ad90bd6deb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95"
Apr 23 17:16:35.152821 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:35.152805 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb5847d6-03ec-4f95-b52e-73ad90bd6deb-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-xnt95\" (UID: \"fb5847d6-03ec-4f95-b52e-73ad90bd6deb\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95"
Apr 23 17:16:35.190527 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:35.190491 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95"
Apr 23 17:16:35.312619 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:35.312592 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95"]
Apr 23 17:16:35.314802 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:16:35.314771 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb5847d6_03ec_4f95_b52e_73ad90bd6deb.slice/crio-4ca2221d51e7865ab86be26e04afded0688943d435832788409225ce5a1b36f2 WatchSource:0}: Error finding container 4ca2221d51e7865ab86be26e04afded0688943d435832788409225ce5a1b36f2: Status 404 returned error can't find the container with id 4ca2221d51e7865ab86be26e04afded0688943d435832788409225ce5a1b36f2
Apr 23 17:16:35.316717 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:35.316695 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 17:16:35.506693 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:35.506599 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95" event={"ID":"fb5847d6-03ec-4f95-b52e-73ad90bd6deb","Type":"ContainerStarted","Data":"37c5c9eaa7d6f13169d1e98d569fe17c2021a5b94912b7832c94a88380ca906a"}
Apr 23 17:16:35.506693 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:35.506650 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95" event={"ID":"fb5847d6-03ec-4f95-b52e-73ad90bd6deb","Type":"ContainerStarted","Data":"4ca2221d51e7865ab86be26e04afded0688943d435832788409225ce5a1b36f2"}
Apr 23 17:16:39.450126 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.450102 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55"
Apr 23 17:16:39.520882 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.520733 2562 generic.go:358] "Generic (PLEG): container finished" podID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerID="92c13990d8defe4b5ec7a970a2012d0d7c0ecb8405f921b9fab4b8107169d985" exitCode=0
Apr 23 17:16:39.520882 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.520780 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" event={"ID":"41bebb8a-5ad6-4341-a8f8-4735ee6f3a49","Type":"ContainerDied","Data":"92c13990d8defe4b5ec7a970a2012d0d7c0ecb8405f921b9fab4b8107169d985"}
Apr 23 17:16:39.520882 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.520820 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55" event={"ID":"41bebb8a-5ad6-4341-a8f8-4735ee6f3a49","Type":"ContainerDied","Data":"968ec26ffd0bf6d0d65e4fd2026878ab5898083fdf962a2c9dfc2852a3395df6"}
Apr 23 17:16:39.520882 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.520829 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55"
Apr 23 17:16:39.520882 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.520837 2562 scope.go:117] "RemoveContainer" containerID="92c13990d8defe4b5ec7a970a2012d0d7c0ecb8405f921b9fab4b8107169d985"
Apr 23 17:16:39.522161 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.522136 2562 generic.go:358] "Generic (PLEG): container finished" podID="fb5847d6-03ec-4f95-b52e-73ad90bd6deb" containerID="37c5c9eaa7d6f13169d1e98d569fe17c2021a5b94912b7832c94a88380ca906a" exitCode=0
Apr 23 17:16:39.522246 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.522186 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95" event={"ID":"fb5847d6-03ec-4f95-b52e-73ad90bd6deb","Type":"ContainerDied","Data":"37c5c9eaa7d6f13169d1e98d569fe17c2021a5b94912b7832c94a88380ca906a"}
Apr 23 17:16:39.528951 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.528932 2562 scope.go:117] "RemoveContainer" containerID="450d229e340fe5fb9a4e69bb0d544efd11cd49d9fc9c7a793c8266c6b31be37c"
Apr 23 17:16:39.535936 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.535907 2562 scope.go:117] "RemoveContainer" containerID="92c13990d8defe4b5ec7a970a2012d0d7c0ecb8405f921b9fab4b8107169d985"
Apr 23 17:16:39.536186 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:16:39.536165 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92c13990d8defe4b5ec7a970a2012d0d7c0ecb8405f921b9fab4b8107169d985\": container with ID starting with 92c13990d8defe4b5ec7a970a2012d0d7c0ecb8405f921b9fab4b8107169d985 not found: ID does not exist" containerID="92c13990d8defe4b5ec7a970a2012d0d7c0ecb8405f921b9fab4b8107169d985"
Apr 23 17:16:39.536243 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.536195 2562 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"92c13990d8defe4b5ec7a970a2012d0d7c0ecb8405f921b9fab4b8107169d985"} err="failed to get container status \"92c13990d8defe4b5ec7a970a2012d0d7c0ecb8405f921b9fab4b8107169d985\": rpc error: code = NotFound desc = could not find container \"92c13990d8defe4b5ec7a970a2012d0d7c0ecb8405f921b9fab4b8107169d985\": container with ID starting with 92c13990d8defe4b5ec7a970a2012d0d7c0ecb8405f921b9fab4b8107169d985 not found: ID does not exist" Apr 23 17:16:39.536243 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.536213 2562 scope.go:117] "RemoveContainer" containerID="450d229e340fe5fb9a4e69bb0d544efd11cd49d9fc9c7a793c8266c6b31be37c" Apr 23 17:16:39.536443 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:16:39.536427 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"450d229e340fe5fb9a4e69bb0d544efd11cd49d9fc9c7a793c8266c6b31be37c\": container with ID starting with 450d229e340fe5fb9a4e69bb0d544efd11cd49d9fc9c7a793c8266c6b31be37c not found: ID does not exist" containerID="450d229e340fe5fb9a4e69bb0d544efd11cd49d9fc9c7a793c8266c6b31be37c" Apr 23 17:16:39.536483 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.536448 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"450d229e340fe5fb9a4e69bb0d544efd11cd49d9fc9c7a793c8266c6b31be37c"} err="failed to get container status \"450d229e340fe5fb9a4e69bb0d544efd11cd49d9fc9c7a793c8266c6b31be37c\": rpc error: code = NotFound desc = could not find container \"450d229e340fe5fb9a4e69bb0d544efd11cd49d9fc9c7a793c8266c6b31be37c\": container with ID starting with 450d229e340fe5fb9a4e69bb0d544efd11cd49d9fc9c7a793c8266c6b31be37c not found: ID does not exist" Apr 23 17:16:39.593855 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.593831 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/41bebb8a-5ad6-4341-a8f8-4735ee6f3a49-kserve-provision-location\") pod \"41bebb8a-5ad6-4341-a8f8-4735ee6f3a49\" (UID: \"41bebb8a-5ad6-4341-a8f8-4735ee6f3a49\") " Apr 23 17:16:39.594113 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.594091 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41bebb8a-5ad6-4341-a8f8-4735ee6f3a49-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" (UID: "41bebb8a-5ad6-4341-a8f8-4735ee6f3a49"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:16:39.695136 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.695099 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41bebb8a-5ad6-4341-a8f8-4735ee6f3a49-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:16:39.843246 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.843214 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55"] Apr 23 17:16:39.846712 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:39.846686 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7cdb7c9b74-lzt55"] Apr 23 17:16:40.527398 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:40.527364 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95" event={"ID":"fb5847d6-03ec-4f95-b52e-73ad90bd6deb","Type":"ContainerStarted","Data":"bea88cf37d75ddd00942f93cc9fd362a96c42f6a040b8b887fa41de0e7187016"} Apr 23 17:16:40.527816 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:40.527574 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95" Apr 23 17:16:40.544005 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:40.543955 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95" podStartSLOduration=6.543941002 podStartE2EDuration="6.543941002s" podCreationTimestamp="2026-04-23 17:16:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:16:40.543304507 +0000 UTC m=+2480.396119571" watchObservedRunningTime="2026-04-23 17:16:40.543941002 +0000 UTC m=+2480.396756066" Apr 23 17:16:40.742491 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:16:40.742456 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" path="/var/lib/kubelet/pods/41bebb8a-5ad6-4341-a8f8-4735ee6f3a49/volumes" Apr 23 17:17:11.535962 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:11.535922 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95" podUID="fb5847d6-03ec-4f95-b52e-73ad90bd6deb" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 23 17:17:21.532653 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:21.532620 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95" Apr 23 17:17:24.958671 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:24.958639 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95"] Apr 23 17:17:24.959076 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:24.958906 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95" 
podUID="fb5847d6-03ec-4f95-b52e-73ad90bd6deb" containerName="kserve-container" containerID="cri-o://bea88cf37d75ddd00942f93cc9fd362a96c42f6a040b8b887fa41de0e7187016" gracePeriod=30 Apr 23 17:17:25.052462 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:25.052421 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z"] Apr 23 17:17:25.052834 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:25.052817 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerName="storage-initializer" Apr 23 17:17:25.052926 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:25.052836 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerName="storage-initializer" Apr 23 17:17:25.052926 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:25.052874 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerName="kserve-container" Apr 23 17:17:25.052926 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:25.052883 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerName="kserve-container" Apr 23 17:17:25.053093 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:25.052962 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="41bebb8a-5ad6-4341-a8f8-4735ee6f3a49" containerName="kserve-container" Apr 23 17:17:25.056205 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:25.056183 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" Apr 23 17:17:25.063398 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:25.063046 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z"] Apr 23 17:17:25.172752 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:25.172717 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2482d6e0-3fd8-4caa-a192-b5591518285a-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-596dd4c984-qk65z\" (UID: \"2482d6e0-3fd8-4caa-a192-b5591518285a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" Apr 23 17:17:25.273838 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:25.273752 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2482d6e0-3fd8-4caa-a192-b5591518285a-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-596dd4c984-qk65z\" (UID: \"2482d6e0-3fd8-4caa-a192-b5591518285a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" Apr 23 17:17:25.274122 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:25.274097 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2482d6e0-3fd8-4caa-a192-b5591518285a-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-596dd4c984-qk65z\" (UID: \"2482d6e0-3fd8-4caa-a192-b5591518285a\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" Apr 23 17:17:25.367468 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:25.367431 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" Apr 23 17:17:25.486800 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:25.486773 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z"] Apr 23 17:17:25.489117 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:17:25.489078 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2482d6e0_3fd8_4caa_a192_b5591518285a.slice/crio-2cbaf310825b8f8047dd652bd117885d771de7c07343ff3d5e1d01794dfca997 WatchSource:0}: Error finding container 2cbaf310825b8f8047dd652bd117885d771de7c07343ff3d5e1d01794dfca997: Status 404 returned error can't find the container with id 2cbaf310825b8f8047dd652bd117885d771de7c07343ff3d5e1d01794dfca997 Apr 23 17:17:25.661432 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:25.661400 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" event={"ID":"2482d6e0-3fd8-4caa-a192-b5591518285a","Type":"ContainerStarted","Data":"fa7ac5a3f322a94272aa592b701d5dbbe4732a757f46f79c525e6f1738add967"} Apr 23 17:17:25.661432 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:25.661436 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" event={"ID":"2482d6e0-3fd8-4caa-a192-b5591518285a","Type":"ContainerStarted","Data":"2cbaf310825b8f8047dd652bd117885d771de7c07343ff3d5e1d01794dfca997"} Apr 23 17:17:30.678102 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:30.678064 2562 generic.go:358] "Generic (PLEG): container finished" podID="2482d6e0-3fd8-4caa-a192-b5591518285a" containerID="fa7ac5a3f322a94272aa592b701d5dbbe4732a757f46f79c525e6f1738add967" exitCode=0 Apr 23 17:17:30.678468 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:30.678116 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" event={"ID":"2482d6e0-3fd8-4caa-a192-b5591518285a","Type":"ContainerDied","Data":"fa7ac5a3f322a94272aa592b701d5dbbe4732a757f46f79c525e6f1738add967"} Apr 23 17:17:31.530714 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:31.530651 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95" podUID="fb5847d6-03ec-4f95-b52e-73ad90bd6deb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.49:8080/v2/models/sklearn-v2-mlserver/ready\": dial tcp 10.133.0.49:8080: connect: connection refused" Apr 23 17:17:31.683727 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:31.683692 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" event={"ID":"2482d6e0-3fd8-4caa-a192-b5591518285a","Type":"ContainerStarted","Data":"64f4f1a86bde965d6bbd18f30691540c2ce810a590f1fc14394e37f2f71f12ba"} Apr 23 17:17:31.684107 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:31.684017 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" Apr 23 17:17:31.685394 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:31.685362 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" podUID="2482d6e0-3fd8-4caa-a192-b5591518285a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 23 17:17:31.700953 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:31.700912 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" podStartSLOduration=6.700896317 podStartE2EDuration="6.700896317s" podCreationTimestamp="2026-04-23 17:17:25 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:17:31.699275264 +0000 UTC m=+2531.552090329" watchObservedRunningTime="2026-04-23 17:17:31.700896317 +0000 UTC m=+2531.553711384" Apr 23 17:17:32.596248 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:32.596222 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95" Apr 23 17:17:32.687885 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:32.687774 2562 generic.go:358] "Generic (PLEG): container finished" podID="fb5847d6-03ec-4f95-b52e-73ad90bd6deb" containerID="bea88cf37d75ddd00942f93cc9fd362a96c42f6a040b8b887fa41de0e7187016" exitCode=0 Apr 23 17:17:32.687885 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:32.687820 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95" event={"ID":"fb5847d6-03ec-4f95-b52e-73ad90bd6deb","Type":"ContainerDied","Data":"bea88cf37d75ddd00942f93cc9fd362a96c42f6a040b8b887fa41de0e7187016"} Apr 23 17:17:32.687885 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:32.687852 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95" Apr 23 17:17:32.687885 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:32.687863 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95" event={"ID":"fb5847d6-03ec-4f95-b52e-73ad90bd6deb","Type":"ContainerDied","Data":"4ca2221d51e7865ab86be26e04afded0688943d435832788409225ce5a1b36f2"} Apr 23 17:17:32.687885 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:32.687885 2562 scope.go:117] "RemoveContainer" containerID="bea88cf37d75ddd00942f93cc9fd362a96c42f6a040b8b887fa41de0e7187016" Apr 23 17:17:32.688501 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:32.688471 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" podUID="2482d6e0-3fd8-4caa-a192-b5591518285a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 23 17:17:32.695807 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:32.695788 2562 scope.go:117] "RemoveContainer" containerID="37c5c9eaa7d6f13169d1e98d569fe17c2021a5b94912b7832c94a88380ca906a" Apr 23 17:17:32.703377 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:32.703361 2562 scope.go:117] "RemoveContainer" containerID="bea88cf37d75ddd00942f93cc9fd362a96c42f6a040b8b887fa41de0e7187016" Apr 23 17:17:32.703638 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:17:32.703615 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bea88cf37d75ddd00942f93cc9fd362a96c42f6a040b8b887fa41de0e7187016\": container with ID starting with bea88cf37d75ddd00942f93cc9fd362a96c42f6a040b8b887fa41de0e7187016 not found: ID does not exist" containerID="bea88cf37d75ddd00942f93cc9fd362a96c42f6a040b8b887fa41de0e7187016" Apr 23 17:17:32.703736 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:32.703644 2562 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bea88cf37d75ddd00942f93cc9fd362a96c42f6a040b8b887fa41de0e7187016"} err="failed to get container status \"bea88cf37d75ddd00942f93cc9fd362a96c42f6a040b8b887fa41de0e7187016\": rpc error: code = NotFound desc = could not find container \"bea88cf37d75ddd00942f93cc9fd362a96c42f6a040b8b887fa41de0e7187016\": container with ID starting with bea88cf37d75ddd00942f93cc9fd362a96c42f6a040b8b887fa41de0e7187016 not found: ID does not exist" Apr 23 17:17:32.703736 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:32.703663 2562 scope.go:117] "RemoveContainer" containerID="37c5c9eaa7d6f13169d1e98d569fe17c2021a5b94912b7832c94a88380ca906a" Apr 23 17:17:32.703967 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:17:32.703948 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37c5c9eaa7d6f13169d1e98d569fe17c2021a5b94912b7832c94a88380ca906a\": container with ID starting with 37c5c9eaa7d6f13169d1e98d569fe17c2021a5b94912b7832c94a88380ca906a not found: ID does not exist" containerID="37c5c9eaa7d6f13169d1e98d569fe17c2021a5b94912b7832c94a88380ca906a" Apr 23 17:17:32.704019 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:32.703977 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37c5c9eaa7d6f13169d1e98d569fe17c2021a5b94912b7832c94a88380ca906a"} err="failed to get container status \"37c5c9eaa7d6f13169d1e98d569fe17c2021a5b94912b7832c94a88380ca906a\": rpc error: code = NotFound desc = could not find container \"37c5c9eaa7d6f13169d1e98d569fe17c2021a5b94912b7832c94a88380ca906a\": container with ID starting with 37c5c9eaa7d6f13169d1e98d569fe17c2021a5b94912b7832c94a88380ca906a not found: ID does not exist" Apr 23 17:17:32.737234 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:32.737200 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb5847d6-03ec-4f95-b52e-73ad90bd6deb-kserve-provision-location\") pod \"fb5847d6-03ec-4f95-b52e-73ad90bd6deb\" (UID: \"fb5847d6-03ec-4f95-b52e-73ad90bd6deb\") " Apr 23 17:17:32.737591 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:32.737567 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb5847d6-03ec-4f95-b52e-73ad90bd6deb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fb5847d6-03ec-4f95-b52e-73ad90bd6deb" (UID: "fb5847d6-03ec-4f95-b52e-73ad90bd6deb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:17:32.838355 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:32.838314 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb5847d6-03ec-4f95-b52e-73ad90bd6deb-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:17:33.003345 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:33.003264 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95"] Apr 23 17:17:33.006224 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:33.006193 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-xnt95"] Apr 23 17:17:34.743320 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:34.743286 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb5847d6-03ec-4f95-b52e-73ad90bd6deb" path="/var/lib/kubelet/pods/fb5847d6-03ec-4f95-b52e-73ad90bd6deb/volumes" Apr 23 17:17:42.689145 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:42.689101 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" podUID="2482d6e0-3fd8-4caa-a192-b5591518285a" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 23 17:17:52.689542 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:17:52.689504 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" Apr 23 17:18:02.064857 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.064814 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-596dd4c984-qk65z_2482d6e0-3fd8-4caa-a192-b5591518285a/kserve-container/0.log" Apr 23 17:18:02.193528 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.193490 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z"] Apr 23 17:18:02.193864 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.193821 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" podUID="2482d6e0-3fd8-4caa-a192-b5591518285a" containerName="kserve-container" containerID="cri-o://64f4f1a86bde965d6bbd18f30691540c2ce810a590f1fc14394e37f2f71f12ba" gracePeriod=30 Apr 23 17:18:02.254702 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.254668 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc"] Apr 23 17:18:02.255028 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.255015 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb5847d6-03ec-4f95-b52e-73ad90bd6deb" containerName="kserve-container" Apr 23 17:18:02.255079 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.255030 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5847d6-03ec-4f95-b52e-73ad90bd6deb" containerName="kserve-container" Apr 23 17:18:02.255079 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.255045 2562 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb5847d6-03ec-4f95-b52e-73ad90bd6deb" containerName="storage-initializer" Apr 23 17:18:02.255079 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.255051 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5847d6-03ec-4f95-b52e-73ad90bd6deb" containerName="storage-initializer" Apr 23 17:18:02.255175 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.255102 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb5847d6-03ec-4f95-b52e-73ad90bd6deb" containerName="kserve-container" Apr 23 17:18:02.258281 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.258266 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" Apr 23 17:18:02.266690 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.266664 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc"] Apr 23 17:18:02.385068 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.384979 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e71e7e87-3c84-4329-8ec3-40f541c098eb-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc\" (UID: \"e71e7e87-3c84-4329-8ec3-40f541c098eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" Apr 23 17:18:02.486350 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.486310 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e71e7e87-3c84-4329-8ec3-40f541c098eb-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc\" (UID: \"e71e7e87-3c84-4329-8ec3-40f541c098eb\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" Apr 23 17:18:02.486682 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.486663 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e71e7e87-3c84-4329-8ec3-40f541c098eb-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc\" (UID: \"e71e7e87-3c84-4329-8ec3-40f541c098eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" Apr 23 17:18:02.568520 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.568483 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" Apr 23 17:18:02.688683 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.688640 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" podUID="2482d6e0-3fd8-4caa-a192-b5591518285a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused" Apr 23 17:18:02.693311 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.693286 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc"] Apr 23 17:18:02.696417 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:18:02.696389 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode71e7e87_3c84_4329_8ec3_40f541c098eb.slice/crio-eb74afddb7075a1f0de096a64290b01fd0e1dcab04cdaadec78fb04a58291529 WatchSource:0}: Error finding container eb74afddb7075a1f0de096a64290b01fd0e1dcab04cdaadec78fb04a58291529: Status 404 returned error can't find the container with id eb74afddb7075a1f0de096a64290b01fd0e1dcab04cdaadec78fb04a58291529 Apr 23 17:18:02.779585 ip-10-0-135-57 kubenswrapper[2562]: I0423 
17:18:02.779550 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" event={"ID":"e71e7e87-3c84-4329-8ec3-40f541c098eb","Type":"ContainerStarted","Data":"10c0fe5516b09a18f1a260c97893ced661133717f3a76cf85ad92d94ec5869fe"} Apr 23 17:18:02.779736 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:02.779594 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" event={"ID":"e71e7e87-3c84-4329-8ec3-40f541c098eb","Type":"ContainerStarted","Data":"eb74afddb7075a1f0de096a64290b01fd0e1dcab04cdaadec78fb04a58291529"} Apr 23 17:18:03.238995 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.238967 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" Apr 23 17:18:03.294404 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.294364 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2482d6e0-3fd8-4caa-a192-b5591518285a-kserve-provision-location\") pod \"2482d6e0-3fd8-4caa-a192-b5591518285a\" (UID: \"2482d6e0-3fd8-4caa-a192-b5591518285a\") " Apr 23 17:18:03.319260 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.319222 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2482d6e0-3fd8-4caa-a192-b5591518285a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2482d6e0-3fd8-4caa-a192-b5591518285a" (UID: "2482d6e0-3fd8-4caa-a192-b5591518285a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:18:03.395793 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.395737 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2482d6e0-3fd8-4caa-a192-b5591518285a-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:18:03.783824 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.783685 2562 generic.go:358] "Generic (PLEG): container finished" podID="2482d6e0-3fd8-4caa-a192-b5591518285a" containerID="64f4f1a86bde965d6bbd18f30691540c2ce810a590f1fc14394e37f2f71f12ba" exitCode=0 Apr 23 17:18:03.783824 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.783785 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" Apr 23 17:18:03.784057 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.783780 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" event={"ID":"2482d6e0-3fd8-4caa-a192-b5591518285a","Type":"ContainerDied","Data":"64f4f1a86bde965d6bbd18f30691540c2ce810a590f1fc14394e37f2f71f12ba"} Apr 23 17:18:03.784057 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.783893 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z" event={"ID":"2482d6e0-3fd8-4caa-a192-b5591518285a","Type":"ContainerDied","Data":"2cbaf310825b8f8047dd652bd117885d771de7c07343ff3d5e1d01794dfca997"} Apr 23 17:18:03.784057 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.783916 2562 scope.go:117] "RemoveContainer" containerID="64f4f1a86bde965d6bbd18f30691540c2ce810a590f1fc14394e37f2f71f12ba" Apr 23 17:18:03.791647 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.791546 2562 scope.go:117] "RemoveContainer" 
containerID="fa7ac5a3f322a94272aa592b701d5dbbe4732a757f46f79c525e6f1738add967" Apr 23 17:18:03.799056 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.798820 2562 scope.go:117] "RemoveContainer" containerID="64f4f1a86bde965d6bbd18f30691540c2ce810a590f1fc14394e37f2f71f12ba" Apr 23 17:18:03.799206 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:18:03.799183 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64f4f1a86bde965d6bbd18f30691540c2ce810a590f1fc14394e37f2f71f12ba\": container with ID starting with 64f4f1a86bde965d6bbd18f30691540c2ce810a590f1fc14394e37f2f71f12ba not found: ID does not exist" containerID="64f4f1a86bde965d6bbd18f30691540c2ce810a590f1fc14394e37f2f71f12ba" Apr 23 17:18:03.799281 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.799217 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64f4f1a86bde965d6bbd18f30691540c2ce810a590f1fc14394e37f2f71f12ba"} err="failed to get container status \"64f4f1a86bde965d6bbd18f30691540c2ce810a590f1fc14394e37f2f71f12ba\": rpc error: code = NotFound desc = could not find container \"64f4f1a86bde965d6bbd18f30691540c2ce810a590f1fc14394e37f2f71f12ba\": container with ID starting with 64f4f1a86bde965d6bbd18f30691540c2ce810a590f1fc14394e37f2f71f12ba not found: ID does not exist" Apr 23 17:18:03.799281 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.799242 2562 scope.go:117] "RemoveContainer" containerID="fa7ac5a3f322a94272aa592b701d5dbbe4732a757f46f79c525e6f1738add967" Apr 23 17:18:03.799514 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:18:03.799495 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa7ac5a3f322a94272aa592b701d5dbbe4732a757f46f79c525e6f1738add967\": container with ID starting with fa7ac5a3f322a94272aa592b701d5dbbe4732a757f46f79c525e6f1738add967 not found: ID does not exist" 
containerID="fa7ac5a3f322a94272aa592b701d5dbbe4732a757f46f79c525e6f1738add967" Apr 23 17:18:03.799555 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.799520 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa7ac5a3f322a94272aa592b701d5dbbe4732a757f46f79c525e6f1738add967"} err="failed to get container status \"fa7ac5a3f322a94272aa592b701d5dbbe4732a757f46f79c525e6f1738add967\": rpc error: code = NotFound desc = could not find container \"fa7ac5a3f322a94272aa592b701d5dbbe4732a757f46f79c525e6f1738add967\": container with ID starting with fa7ac5a3f322a94272aa592b701d5dbbe4732a757f46f79c525e6f1738add967 not found: ID does not exist" Apr 23 17:18:03.804663 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.804637 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z"] Apr 23 17:18:03.810613 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:03.810589 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-596dd4c984-qk65z"] Apr 23 17:18:04.742143 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:04.742110 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2482d6e0-3fd8-4caa-a192-b5591518285a" path="/var/lib/kubelet/pods/2482d6e0-3fd8-4caa-a192-b5591518285a/volumes" Apr 23 17:18:06.793469 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:06.793431 2562 generic.go:358] "Generic (PLEG): container finished" podID="e71e7e87-3c84-4329-8ec3-40f541c098eb" containerID="10c0fe5516b09a18f1a260c97893ced661133717f3a76cf85ad92d94ec5869fe" exitCode=0 Apr 23 17:18:06.793904 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:06.793495 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" 
event={"ID":"e71e7e87-3c84-4329-8ec3-40f541c098eb","Type":"ContainerDied","Data":"10c0fe5516b09a18f1a260c97893ced661133717f3a76cf85ad92d94ec5869fe"} Apr 23 17:18:07.797718 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:07.797682 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" event={"ID":"e71e7e87-3c84-4329-8ec3-40f541c098eb","Type":"ContainerStarted","Data":"75e861bc29177ea83ad36d77de5f96977bdc5fbfa18d1e7a8e310bce0fb6a5c7"} Apr 23 17:18:07.798132 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:07.797948 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" Apr 23 17:18:07.815929 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:07.815870 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" podStartSLOduration=5.815854648 podStartE2EDuration="5.815854648s" podCreationTimestamp="2026-04-23 17:18:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:18:07.813780293 +0000 UTC m=+2567.666595357" watchObservedRunningTime="2026-04-23 17:18:07.815854648 +0000 UTC m=+2567.668669773" Apr 23 17:18:38.837923 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:38.837873 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" podUID="e71e7e87-3c84-4329-8ec3-40f541c098eb" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 23 17:18:48.803394 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:48.803313 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" Apr 23 17:18:52.360844 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:18:52.360809 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc"] Apr 23 17:18:52.361219 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:52.361049 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" podUID="e71e7e87-3c84-4329-8ec3-40f541c098eb" containerName="kserve-container" containerID="cri-o://75e861bc29177ea83ad36d77de5f96977bdc5fbfa18d1e7a8e310bce0fb6a5c7" gracePeriod=30 Apr 23 17:18:52.430315 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:52.430276 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp"] Apr 23 17:18:52.430677 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:52.430658 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2482d6e0-3fd8-4caa-a192-b5591518285a" containerName="kserve-container" Apr 23 17:18:52.430794 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:52.430679 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="2482d6e0-3fd8-4caa-a192-b5591518285a" containerName="kserve-container" Apr 23 17:18:52.430794 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:52.430699 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2482d6e0-3fd8-4caa-a192-b5591518285a" containerName="storage-initializer" Apr 23 17:18:52.430794 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:52.430708 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="2482d6e0-3fd8-4caa-a192-b5591518285a" containerName="storage-initializer" Apr 23 17:18:52.430974 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:52.430814 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="2482d6e0-3fd8-4caa-a192-b5591518285a" containerName="kserve-container" Apr 23 17:18:52.435416 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:52.435385 
2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" Apr 23 17:18:52.440615 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:52.440587 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp"] Apr 23 17:18:52.479377 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:52.479338 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/417e2587-311a-4b06-8a7d-642f40191d62-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-77bdb67c66-8nvqp\" (UID: \"417e2587-311a-4b06-8a7d-642f40191d62\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" Apr 23 17:18:52.580031 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:52.579988 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/417e2587-311a-4b06-8a7d-642f40191d62-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-77bdb67c66-8nvqp\" (UID: \"417e2587-311a-4b06-8a7d-642f40191d62\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" Apr 23 17:18:52.580416 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:52.580396 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/417e2587-311a-4b06-8a7d-642f40191d62-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-77bdb67c66-8nvqp\" (UID: \"417e2587-311a-4b06-8a7d-642f40191d62\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" Apr 23 17:18:52.746876 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:52.746848 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" Apr 23 17:18:52.877330 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:52.877306 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp"] Apr 23 17:18:52.879246 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:18:52.879217 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod417e2587_311a_4b06_8a7d_642f40191d62.slice/crio-9331636aeffa976bd10180bef03100aafbddfcf3face3fd9c5f43bcea4919bed WatchSource:0}: Error finding container 9331636aeffa976bd10180bef03100aafbddfcf3face3fd9c5f43bcea4919bed: Status 404 returned error can't find the container with id 9331636aeffa976bd10180bef03100aafbddfcf3face3fd9c5f43bcea4919bed Apr 23 17:18:52.930682 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:52.930650 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" event={"ID":"417e2587-311a-4b06-8a7d-642f40191d62","Type":"ContainerStarted","Data":"9331636aeffa976bd10180bef03100aafbddfcf3face3fd9c5f43bcea4919bed"} Apr 23 17:18:53.935055 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:53.935018 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" event={"ID":"417e2587-311a-4b06-8a7d-642f40191d62","Type":"ContainerStarted","Data":"549e432eb1ec7e4ebb25dbd97888e96f502b1c4058a94fcde7f2e883185da025"} Apr 23 17:18:56.944267 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:56.944229 2562 generic.go:358] "Generic (PLEG): container finished" podID="417e2587-311a-4b06-8a7d-642f40191d62" containerID="549e432eb1ec7e4ebb25dbd97888e96f502b1c4058a94fcde7f2e883185da025" exitCode=0 Apr 23 17:18:56.944655 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:56.944304 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" event={"ID":"417e2587-311a-4b06-8a7d-642f40191d62","Type":"ContainerDied","Data":"549e432eb1ec7e4ebb25dbd97888e96f502b1c4058a94fcde7f2e883185da025"} Apr 23 17:18:57.949155 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:57.949123 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" event={"ID":"417e2587-311a-4b06-8a7d-642f40191d62","Type":"ContainerStarted","Data":"abc8fd1823aec69eec21d742074be8e92539fc2dd50c61042a22363157d232b6"} Apr 23 17:18:57.949613 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:57.949417 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" Apr 23 17:18:57.950833 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:57.950805 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" podUID="417e2587-311a-4b06-8a7d-642f40191d62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 23 17:18:57.965962 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:57.965916 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" podStartSLOduration=5.965900882 podStartE2EDuration="5.965900882s" podCreationTimestamp="2026-04-23 17:18:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:18:57.96468476 +0000 UTC m=+2617.817499823" watchObservedRunningTime="2026-04-23 17:18:57.965900882 +0000 UTC m=+2617.818715943" Apr 23 17:18:58.801540 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:58.801492 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" 
podUID="e71e7e87-3c84-4329-8ec3-40f541c098eb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.51:8080/v2/models/isvc-sklearn-v2-runtime/ready\": dial tcp 10.133.0.51:8080: connect: connection refused" Apr 23 17:18:58.952591 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:58.952556 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" podUID="417e2587-311a-4b06-8a7d-642f40191d62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 23 17:18:59.896100 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:59.896075 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" Apr 23 17:18:59.945653 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:59.945615 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e71e7e87-3c84-4329-8ec3-40f541c098eb-kserve-provision-location\") pod \"e71e7e87-3c84-4329-8ec3-40f541c098eb\" (UID: \"e71e7e87-3c84-4329-8ec3-40f541c098eb\") " Apr 23 17:18:59.945962 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:59.945938 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e71e7e87-3c84-4329-8ec3-40f541c098eb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e71e7e87-3c84-4329-8ec3-40f541c098eb" (UID: "e71e7e87-3c84-4329-8ec3-40f541c098eb"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:18:59.957085 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:59.957055 2562 generic.go:358] "Generic (PLEG): container finished" podID="e71e7e87-3c84-4329-8ec3-40f541c098eb" containerID="75e861bc29177ea83ad36d77de5f96977bdc5fbfa18d1e7a8e310bce0fb6a5c7" exitCode=0 Apr 23 17:18:59.957502 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:59.957124 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" Apr 23 17:18:59.957502 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:59.957131 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" event={"ID":"e71e7e87-3c84-4329-8ec3-40f541c098eb","Type":"ContainerDied","Data":"75e861bc29177ea83ad36d77de5f96977bdc5fbfa18d1e7a8e310bce0fb6a5c7"} Apr 23 17:18:59.957502 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:59.957171 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc" event={"ID":"e71e7e87-3c84-4329-8ec3-40f541c098eb","Type":"ContainerDied","Data":"eb74afddb7075a1f0de096a64290b01fd0e1dcab04cdaadec78fb04a58291529"} Apr 23 17:18:59.957502 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:59.957194 2562 scope.go:117] "RemoveContainer" containerID="75e861bc29177ea83ad36d77de5f96977bdc5fbfa18d1e7a8e310bce0fb6a5c7" Apr 23 17:18:59.966791 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:59.966766 2562 scope.go:117] "RemoveContainer" containerID="10c0fe5516b09a18f1a260c97893ced661133717f3a76cf85ad92d94ec5869fe" Apr 23 17:18:59.974349 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:59.974310 2562 scope.go:117] "RemoveContainer" containerID="75e861bc29177ea83ad36d77de5f96977bdc5fbfa18d1e7a8e310bce0fb6a5c7" Apr 23 17:18:59.974661 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:18:59.974642 2562 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75e861bc29177ea83ad36d77de5f96977bdc5fbfa18d1e7a8e310bce0fb6a5c7\": container with ID starting with 75e861bc29177ea83ad36d77de5f96977bdc5fbfa18d1e7a8e310bce0fb6a5c7 not found: ID does not exist" containerID="75e861bc29177ea83ad36d77de5f96977bdc5fbfa18d1e7a8e310bce0fb6a5c7" Apr 23 17:18:59.974713 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:59.974671 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75e861bc29177ea83ad36d77de5f96977bdc5fbfa18d1e7a8e310bce0fb6a5c7"} err="failed to get container status \"75e861bc29177ea83ad36d77de5f96977bdc5fbfa18d1e7a8e310bce0fb6a5c7\": rpc error: code = NotFound desc = could not find container \"75e861bc29177ea83ad36d77de5f96977bdc5fbfa18d1e7a8e310bce0fb6a5c7\": container with ID starting with 75e861bc29177ea83ad36d77de5f96977bdc5fbfa18d1e7a8e310bce0fb6a5c7 not found: ID does not exist" Apr 23 17:18:59.974713 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:59.974690 2562 scope.go:117] "RemoveContainer" containerID="10c0fe5516b09a18f1a260c97893ced661133717f3a76cf85ad92d94ec5869fe" Apr 23 17:18:59.975286 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:18:59.975261 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10c0fe5516b09a18f1a260c97893ced661133717f3a76cf85ad92d94ec5869fe\": container with ID starting with 10c0fe5516b09a18f1a260c97893ced661133717f3a76cf85ad92d94ec5869fe not found: ID does not exist" containerID="10c0fe5516b09a18f1a260c97893ced661133717f3a76cf85ad92d94ec5869fe" Apr 23 17:18:59.975393 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:59.975290 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10c0fe5516b09a18f1a260c97893ced661133717f3a76cf85ad92d94ec5869fe"} err="failed to get container status 
\"10c0fe5516b09a18f1a260c97893ced661133717f3a76cf85ad92d94ec5869fe\": rpc error: code = NotFound desc = could not find container \"10c0fe5516b09a18f1a260c97893ced661133717f3a76cf85ad92d94ec5869fe\": container with ID starting with 10c0fe5516b09a18f1a260c97893ced661133717f3a76cf85ad92d94ec5869fe not found: ID does not exist" Apr 23 17:18:59.979122 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:59.979103 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc"] Apr 23 17:18:59.982513 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:18:59.982490 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-xn9tc"] Apr 23 17:19:00.046715 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:19:00.046629 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e71e7e87-3c84-4329-8ec3-40f541c098eb-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:19:00.743388 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:19:00.743354 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e71e7e87-3c84-4329-8ec3-40f541c098eb" path="/var/lib/kubelet/pods/e71e7e87-3c84-4329-8ec3-40f541c098eb/volumes" Apr 23 17:19:08.952875 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:19:08.952835 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" podUID="417e2587-311a-4b06-8a7d-642f40191d62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 23 17:19:18.953520 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:19:18.953479 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" podUID="417e2587-311a-4b06-8a7d-642f40191d62" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 23 17:19:28.953317 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:19:28.953259 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" podUID="417e2587-311a-4b06-8a7d-642f40191d62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 23 17:19:38.953560 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:19:38.953504 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" podUID="417e2587-311a-4b06-8a7d-642f40191d62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 23 17:19:48.953230 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:19:48.953190 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" podUID="417e2587-311a-4b06-8a7d-642f40191d62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 23 17:19:58.952659 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:19:58.952611 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" podUID="417e2587-311a-4b06-8a7d-642f40191d62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused" Apr 23 17:20:08.953945 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:08.953915 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" Apr 23 17:20:12.672106 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:12.672072 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp"] Apr 23 17:20:12.672480 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:12.672365 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" podUID="417e2587-311a-4b06-8a7d-642f40191d62" containerName="kserve-container" containerID="cri-o://abc8fd1823aec69eec21d742074be8e92539fc2dd50c61042a22363157d232b6" gracePeriod=30 Apr 23 17:20:12.734156 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:12.734114 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652"] Apr 23 17:20:12.734533 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:12.734514 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e71e7e87-3c84-4329-8ec3-40f541c098eb" containerName="kserve-container" Apr 23 17:20:12.734625 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:12.734535 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71e7e87-3c84-4329-8ec3-40f541c098eb" containerName="kserve-container" Apr 23 17:20:12.734625 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:12.734555 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e71e7e87-3c84-4329-8ec3-40f541c098eb" containerName="storage-initializer" Apr 23 17:20:12.734625 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:12.734563 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71e7e87-3c84-4329-8ec3-40f541c098eb" containerName="storage-initializer" Apr 23 17:20:12.734825 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:12.734633 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="e71e7e87-3c84-4329-8ec3-40f541c098eb" containerName="kserve-container" Apr 23 17:20:12.738022 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:12.737987 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" Apr 23 17:20:12.745207 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:12.745110 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652"] Apr 23 17:20:12.857994 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:12.857955 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4156c1d-7c47-4c7f-a157-70846e4b3937-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-8655f595df-bq652\" (UID: \"f4156c1d-7c47-4c7f-a157-70846e4b3937\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" Apr 23 17:20:12.959430 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:12.959341 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4156c1d-7c47-4c7f-a157-70846e4b3937-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-8655f595df-bq652\" (UID: \"f4156c1d-7c47-4c7f-a157-70846e4b3937\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" Apr 23 17:20:12.959717 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:12.959695 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4156c1d-7c47-4c7f-a157-70846e4b3937-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-8655f595df-bq652\" (UID: \"f4156c1d-7c47-4c7f-a157-70846e4b3937\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" Apr 23 17:20:13.049196 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:13.049162 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" Apr 23 17:20:13.168512 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:13.168486 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652"] Apr 23 17:20:13.170961 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:20:13.170926 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4156c1d_7c47_4c7f_a157_70846e4b3937.slice/crio-280a7f5da9d49da910fc031014f8d9a1c9d4dc1009601d9be9f22939a2efe7d1 WatchSource:0}: Error finding container 280a7f5da9d49da910fc031014f8d9a1c9d4dc1009601d9be9f22939a2efe7d1: Status 404 returned error can't find the container with id 280a7f5da9d49da910fc031014f8d9a1c9d4dc1009601d9be9f22939a2efe7d1 Apr 23 17:20:14.172059 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:14.172021 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" event={"ID":"f4156c1d-7c47-4c7f-a157-70846e4b3937","Type":"ContainerStarted","Data":"9b6dbb8ee05e1f7cc32e290e6ae236bf7f19e67626c4a8eca022ba3753226844"} Apr 23 17:20:14.172059 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:14.172063 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" event={"ID":"f4156c1d-7c47-4c7f-a157-70846e4b3937","Type":"ContainerStarted","Data":"280a7f5da9d49da910fc031014f8d9a1c9d4dc1009601d9be9f22939a2efe7d1"} Apr 23 17:20:17.182654 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:17.182624 2562 generic.go:358] "Generic (PLEG): container finished" podID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerID="9b6dbb8ee05e1f7cc32e290e6ae236bf7f19e67626c4a8eca022ba3753226844" exitCode=0 Apr 23 17:20:17.183057 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:17.182668 2562 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" event={"ID":"f4156c1d-7c47-4c7f-a157-70846e4b3937","Type":"ContainerDied","Data":"9b6dbb8ee05e1f7cc32e290e6ae236bf7f19e67626c4a8eca022ba3753226844"} Apr 23 17:20:17.314699 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:17.314674 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" Apr 23 17:20:17.397584 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:17.397551 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/417e2587-311a-4b06-8a7d-642f40191d62-kserve-provision-location\") pod \"417e2587-311a-4b06-8a7d-642f40191d62\" (UID: \"417e2587-311a-4b06-8a7d-642f40191d62\") " Apr 23 17:20:17.397926 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:17.397904 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/417e2587-311a-4b06-8a7d-642f40191d62-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "417e2587-311a-4b06-8a7d-642f40191d62" (UID: "417e2587-311a-4b06-8a7d-642f40191d62"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:20:17.499100 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:17.499068 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/417e2587-311a-4b06-8a7d-642f40191d62-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:20:18.186902 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.186866 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" event={"ID":"f4156c1d-7c47-4c7f-a157-70846e4b3937","Type":"ContainerStarted","Data":"88a5b697253e30447d1acdc0511aca0fbafa12460055c592b07943a3f728f1b8"} Apr 23 17:20:18.187358 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.187255 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" Apr 23 17:20:18.188393 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.188364 2562 generic.go:358] "Generic (PLEG): container finished" podID="417e2587-311a-4b06-8a7d-642f40191d62" containerID="abc8fd1823aec69eec21d742074be8e92539fc2dd50c61042a22363157d232b6" exitCode=0 Apr 23 17:20:18.188524 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.188415 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" event={"ID":"417e2587-311a-4b06-8a7d-642f40191d62","Type":"ContainerDied","Data":"abc8fd1823aec69eec21d742074be8e92539fc2dd50c61042a22363157d232b6"} Apr 23 17:20:18.188524 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.188426 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" Apr 23 17:20:18.188524 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.188438 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp" event={"ID":"417e2587-311a-4b06-8a7d-642f40191d62","Type":"ContainerDied","Data":"9331636aeffa976bd10180bef03100aafbddfcf3face3fd9c5f43bcea4919bed"} Apr 23 17:20:18.188524 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.188457 2562 scope.go:117] "RemoveContainer" containerID="abc8fd1823aec69eec21d742074be8e92539fc2dd50c61042a22363157d232b6" Apr 23 17:20:18.188832 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.188805 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" podUID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 23 17:20:18.196832 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.196799 2562 scope.go:117] "RemoveContainer" containerID="549e432eb1ec7e4ebb25dbd97888e96f502b1c4058a94fcde7f2e883185da025" Apr 23 17:20:18.203640 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.203601 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" podStartSLOduration=6.203588631 podStartE2EDuration="6.203588631s" podCreationTimestamp="2026-04-23 17:20:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:20:18.202398425 +0000 UTC m=+2698.055213489" watchObservedRunningTime="2026-04-23 17:20:18.203588631 +0000 UTC m=+2698.056403694" Apr 23 17:20:18.204538 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.204522 2562 scope.go:117] "RemoveContainer" 
containerID="abc8fd1823aec69eec21d742074be8e92539fc2dd50c61042a22363157d232b6" Apr 23 17:20:18.204795 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:20:18.204775 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abc8fd1823aec69eec21d742074be8e92539fc2dd50c61042a22363157d232b6\": container with ID starting with abc8fd1823aec69eec21d742074be8e92539fc2dd50c61042a22363157d232b6 not found: ID does not exist" containerID="abc8fd1823aec69eec21d742074be8e92539fc2dd50c61042a22363157d232b6" Apr 23 17:20:18.204841 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.204804 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abc8fd1823aec69eec21d742074be8e92539fc2dd50c61042a22363157d232b6"} err="failed to get container status \"abc8fd1823aec69eec21d742074be8e92539fc2dd50c61042a22363157d232b6\": rpc error: code = NotFound desc = could not find container \"abc8fd1823aec69eec21d742074be8e92539fc2dd50c61042a22363157d232b6\": container with ID starting with abc8fd1823aec69eec21d742074be8e92539fc2dd50c61042a22363157d232b6 not found: ID does not exist" Apr 23 17:20:18.204841 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.204822 2562 scope.go:117] "RemoveContainer" containerID="549e432eb1ec7e4ebb25dbd97888e96f502b1c4058a94fcde7f2e883185da025" Apr 23 17:20:18.205052 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:20:18.205032 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"549e432eb1ec7e4ebb25dbd97888e96f502b1c4058a94fcde7f2e883185da025\": container with ID starting with 549e432eb1ec7e4ebb25dbd97888e96f502b1c4058a94fcde7f2e883185da025 not found: ID does not exist" containerID="549e432eb1ec7e4ebb25dbd97888e96f502b1c4058a94fcde7f2e883185da025" Apr 23 17:20:18.205141 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.205055 2562 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"549e432eb1ec7e4ebb25dbd97888e96f502b1c4058a94fcde7f2e883185da025"} err="failed to get container status \"549e432eb1ec7e4ebb25dbd97888e96f502b1c4058a94fcde7f2e883185da025\": rpc error: code = NotFound desc = could not find container \"549e432eb1ec7e4ebb25dbd97888e96f502b1c4058a94fcde7f2e883185da025\": container with ID starting with 549e432eb1ec7e4ebb25dbd97888e96f502b1c4058a94fcde7f2e883185da025 not found: ID does not exist" Apr 23 17:20:18.214923 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.214900 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp"] Apr 23 17:20:18.218158 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.218138 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-77bdb67c66-8nvqp"] Apr 23 17:20:18.742317 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:18.742286 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="417e2587-311a-4b06-8a7d-642f40191d62" path="/var/lib/kubelet/pods/417e2587-311a-4b06-8a7d-642f40191d62/volumes" Apr 23 17:20:19.192988 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:19.192950 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" podUID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 23 17:20:29.193664 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:29.193611 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" podUID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 23 17:20:39.193430 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:39.193380 2562 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" podUID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 23 17:20:49.193393 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:49.193341 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" podUID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 23 17:20:59.193806 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:20:59.193729 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" podUID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 23 17:21:09.193926 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:09.193878 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" podUID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 23 17:21:19.193083 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:19.193037 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" podUID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.53:8080: connect: connection refused" Apr 23 17:21:29.194549 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:29.194517 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" Apr 23 17:21:32.854034 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:32.854000 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652"] Apr 23 17:21:32.856445 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:32.854222 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" podUID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerName="kserve-container" containerID="cri-o://88a5b697253e30447d1acdc0511aca0fbafa12460055c592b07943a3f728f1b8" gracePeriod=30 Apr 23 17:21:32.921248 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:32.921212 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx"] Apr 23 17:21:32.921569 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:32.921554 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="417e2587-311a-4b06-8a7d-642f40191d62" containerName="storage-initializer" Apr 23 17:21:32.921622 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:32.921570 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="417e2587-311a-4b06-8a7d-642f40191d62" containerName="storage-initializer" Apr 23 17:21:32.921622 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:32.921580 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="417e2587-311a-4b06-8a7d-642f40191d62" containerName="kserve-container" Apr 23 17:21:32.921622 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:32.921588 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="417e2587-311a-4b06-8a7d-642f40191d62" containerName="kserve-container" Apr 23 17:21:32.921720 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:32.921655 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="417e2587-311a-4b06-8a7d-642f40191d62" 
containerName="kserve-container" Apr 23 17:21:32.924505 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:32.924490 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" Apr 23 17:21:32.931059 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:32.931027 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx"] Apr 23 17:21:32.983728 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:32.983682 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b481433-d513-415b-8524-f66f00ee88f2-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-4n8rx\" (UID: \"0b481433-d513-415b-8524-f66f00ee88f2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" Apr 23 17:21:33.085092 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:33.085050 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b481433-d513-415b-8524-f66f00ee88f2-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-4n8rx\" (UID: \"0b481433-d513-415b-8524-f66f00ee88f2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" Apr 23 17:21:33.085439 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:33.085418 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b481433-d513-415b-8524-f66f00ee88f2-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-4n8rx\" (UID: \"0b481433-d513-415b-8524-f66f00ee88f2\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" Apr 23 17:21:33.235795 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:33.235733 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" Apr 23 17:21:33.353895 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:33.353838 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx"] Apr 23 17:21:33.356588 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:21:33.356554 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b481433_d513_415b_8524_f66f00ee88f2.slice/crio-3e7b836f6e128d0705637b9c4d2d235d7ddd5be58a1f31ff916897e24e0d2833 WatchSource:0}: Error finding container 3e7b836f6e128d0705637b9c4d2d235d7ddd5be58a1f31ff916897e24e0d2833: Status 404 returned error can't find the container with id 3e7b836f6e128d0705637b9c4d2d235d7ddd5be58a1f31ff916897e24e0d2833 Apr 23 17:21:33.404301 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:33.404277 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" event={"ID":"0b481433-d513-415b-8524-f66f00ee88f2","Type":"ContainerStarted","Data":"3e7b836f6e128d0705637b9c4d2d235d7ddd5be58a1f31ff916897e24e0d2833"} Apr 23 17:21:34.408677 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:34.408636 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" event={"ID":"0b481433-d513-415b-8524-f66f00ee88f2","Type":"ContainerStarted","Data":"22d093f27b320f8a050ba0da802bb9fb9bd6bf47ff500a954ce02b0bb6537c58"} Apr 23 17:21:37.311852 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.311829 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" Apr 23 17:21:37.420049 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.420015 2562 generic.go:358] "Generic (PLEG): container finished" podID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerID="88a5b697253e30447d1acdc0511aca0fbafa12460055c592b07943a3f728f1b8" exitCode=0 Apr 23 17:21:37.420219 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.420080 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" event={"ID":"f4156c1d-7c47-4c7f-a157-70846e4b3937","Type":"ContainerDied","Data":"88a5b697253e30447d1acdc0511aca0fbafa12460055c592b07943a3f728f1b8"} Apr 23 17:21:37.420219 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.420087 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" Apr 23 17:21:37.420219 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.420118 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652" event={"ID":"f4156c1d-7c47-4c7f-a157-70846e4b3937","Type":"ContainerDied","Data":"280a7f5da9d49da910fc031014f8d9a1c9d4dc1009601d9be9f22939a2efe7d1"} Apr 23 17:21:37.420219 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.420144 2562 scope.go:117] "RemoveContainer" containerID="88a5b697253e30447d1acdc0511aca0fbafa12460055c592b07943a3f728f1b8" Apr 23 17:21:37.422420 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.422400 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4156c1d-7c47-4c7f-a157-70846e4b3937-kserve-provision-location\") pod \"f4156c1d-7c47-4c7f-a157-70846e4b3937\" (UID: \"f4156c1d-7c47-4c7f-a157-70846e4b3937\") " Apr 23 17:21:37.422790 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.422767 
2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4156c1d-7c47-4c7f-a157-70846e4b3937-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f4156c1d-7c47-4c7f-a157-70846e4b3937" (UID: "f4156c1d-7c47-4c7f-a157-70846e4b3937"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:21:37.428446 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.428427 2562 scope.go:117] "RemoveContainer" containerID="9b6dbb8ee05e1f7cc32e290e6ae236bf7f19e67626c4a8eca022ba3753226844" Apr 23 17:21:37.435909 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.435893 2562 scope.go:117] "RemoveContainer" containerID="88a5b697253e30447d1acdc0511aca0fbafa12460055c592b07943a3f728f1b8" Apr 23 17:21:37.436147 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:21:37.436132 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88a5b697253e30447d1acdc0511aca0fbafa12460055c592b07943a3f728f1b8\": container with ID starting with 88a5b697253e30447d1acdc0511aca0fbafa12460055c592b07943a3f728f1b8 not found: ID does not exist" containerID="88a5b697253e30447d1acdc0511aca0fbafa12460055c592b07943a3f728f1b8" Apr 23 17:21:37.436200 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.436156 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88a5b697253e30447d1acdc0511aca0fbafa12460055c592b07943a3f728f1b8"} err="failed to get container status \"88a5b697253e30447d1acdc0511aca0fbafa12460055c592b07943a3f728f1b8\": rpc error: code = NotFound desc = could not find container \"88a5b697253e30447d1acdc0511aca0fbafa12460055c592b07943a3f728f1b8\": container with ID starting with 88a5b697253e30447d1acdc0511aca0fbafa12460055c592b07943a3f728f1b8 not found: ID does not exist" Apr 23 17:21:37.436200 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.436180 2562 scope.go:117] 
"RemoveContainer" containerID="9b6dbb8ee05e1f7cc32e290e6ae236bf7f19e67626c4a8eca022ba3753226844" Apr 23 17:21:37.436433 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:21:37.436411 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b6dbb8ee05e1f7cc32e290e6ae236bf7f19e67626c4a8eca022ba3753226844\": container with ID starting with 9b6dbb8ee05e1f7cc32e290e6ae236bf7f19e67626c4a8eca022ba3753226844 not found: ID does not exist" containerID="9b6dbb8ee05e1f7cc32e290e6ae236bf7f19e67626c4a8eca022ba3753226844" Apr 23 17:21:37.436482 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.436439 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b6dbb8ee05e1f7cc32e290e6ae236bf7f19e67626c4a8eca022ba3753226844"} err="failed to get container status \"9b6dbb8ee05e1f7cc32e290e6ae236bf7f19e67626c4a8eca022ba3753226844\": rpc error: code = NotFound desc = could not find container \"9b6dbb8ee05e1f7cc32e290e6ae236bf7f19e67626c4a8eca022ba3753226844\": container with ID starting with 9b6dbb8ee05e1f7cc32e290e6ae236bf7f19e67626c4a8eca022ba3753226844 not found: ID does not exist" Apr 23 17:21:37.523660 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.523627 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f4156c1d-7c47-4c7f-a157-70846e4b3937-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:21:37.742087 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.742056 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652"] Apr 23 17:21:37.746006 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:37.745979 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-8655f595df-bq652"] Apr 23 17:21:38.425152 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:21:38.425115 2562 generic.go:358] "Generic (PLEG): container finished" podID="0b481433-d513-415b-8524-f66f00ee88f2" containerID="22d093f27b320f8a050ba0da802bb9fb9bd6bf47ff500a954ce02b0bb6537c58" exitCode=0 Apr 23 17:21:38.425506 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:38.425191 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" event={"ID":"0b481433-d513-415b-8524-f66f00ee88f2","Type":"ContainerDied","Data":"22d093f27b320f8a050ba0da802bb9fb9bd6bf47ff500a954ce02b0bb6537c58"} Apr 23 17:21:38.426353 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:38.426336 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 17:21:38.742435 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:38.742339 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4156c1d-7c47-4c7f-a157-70846e4b3937" path="/var/lib/kubelet/pods/f4156c1d-7c47-4c7f-a157-70846e4b3937/volumes" Apr 23 17:21:43.445123 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:43.445088 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" event={"ID":"0b481433-d513-415b-8524-f66f00ee88f2","Type":"ContainerStarted","Data":"23ee6b180c7f2142720f979e92ffeb0870e8f71aa020038a57d002e834c29756"} Apr 23 17:21:43.445493 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:43.445361 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" Apr 23 17:21:43.446699 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:43.446672 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" podUID="0b481433-d513-415b-8524-f66f00ee88f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 23 
17:21:43.462912 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:43.462857 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" podStartSLOduration=7.472540173 podStartE2EDuration="11.462844088s" podCreationTimestamp="2026-04-23 17:21:32 +0000 UTC" firstStartedPulling="2026-04-23 17:21:38.426458868 +0000 UTC m=+2778.279273909" lastFinishedPulling="2026-04-23 17:21:42.416762778 +0000 UTC m=+2782.269577824" observedRunningTime="2026-04-23 17:21:43.460622929 +0000 UTC m=+2783.313437995" watchObservedRunningTime="2026-04-23 17:21:43.462844088 +0000 UTC m=+2783.315659151" Apr 23 17:21:44.448176 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:44.448135 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" podUID="0b481433-d513-415b-8524-f66f00ee88f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 23 17:21:54.449087 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:21:54.449061 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" Apr 23 17:22:14.064870 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:14.064826 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx"] Apr 23 17:22:14.065371 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:14.065111 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" podUID="0b481433-d513-415b-8524-f66f00ee88f2" containerName="kserve-container" containerID="cri-o://23ee6b180c7f2142720f979e92ffeb0870e8f71aa020038a57d002e834c29756" gracePeriod=30 Apr 23 17:22:14.133588 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:14.133552 2562 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx"] Apr 23 17:22:14.133877 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:14.133864 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerName="storage-initializer" Apr 23 17:22:14.133930 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:14.133878 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerName="storage-initializer" Apr 23 17:22:14.133930 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:14.133898 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerName="kserve-container" Apr 23 17:22:14.133930 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:14.133903 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerName="kserve-container" Apr 23 17:22:14.134033 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:14.133955 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="f4156c1d-7c47-4c7f-a157-70846e4b3937" containerName="kserve-container" Apr 23 17:22:14.136247 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:14.136231 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" Apr 23 17:22:14.145952 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:14.145923 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx"] Apr 23 17:22:14.227426 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:14.227391 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8b72e29-d15e-481a-8ccb-7ca3d0262f87-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx\" (UID: \"a8b72e29-d15e-481a-8ccb-7ca3d0262f87\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" Apr 23 17:22:14.327950 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:14.327859 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8b72e29-d15e-481a-8ccb-7ca3d0262f87-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx\" (UID: \"a8b72e29-d15e-481a-8ccb-7ca3d0262f87\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" Apr 23 17:22:14.328229 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:14.328209 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8b72e29-d15e-481a-8ccb-7ca3d0262f87-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx\" (UID: \"a8b72e29-d15e-481a-8ccb-7ca3d0262f87\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" Apr 23 17:22:14.447200 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:14.447160 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" Apr 23 17:22:14.570818 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:14.570791 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx"] Apr 23 17:22:14.573310 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:22:14.573283 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8b72e29_d15e_481a_8ccb_7ca3d0262f87.slice/crio-1d29267fa20359c3e4f5aa1362acf4068414adc84f7d281b98ce48a66e9cc0dc WatchSource:0}: Error finding container 1d29267fa20359c3e4f5aa1362acf4068414adc84f7d281b98ce48a66e9cc0dc: Status 404 returned error can't find the container with id 1d29267fa20359c3e4f5aa1362acf4068414adc84f7d281b98ce48a66e9cc0dc Apr 23 17:22:15.539668 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:15.539629 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" event={"ID":"a8b72e29-d15e-481a-8ccb-7ca3d0262f87","Type":"ContainerStarted","Data":"713881e78b884c24350793eac38185205bfd327b820ad0bab5e33b5da5481593"} Apr 23 17:22:15.539668 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:15.539672 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" event={"ID":"a8b72e29-d15e-481a-8ccb-7ca3d0262f87","Type":"ContainerStarted","Data":"1d29267fa20359c3e4f5aa1362acf4068414adc84f7d281b98ce48a66e9cc0dc"} Apr 23 17:22:19.552801 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:19.552766 2562 generic.go:358] "Generic (PLEG): container finished" podID="a8b72e29-d15e-481a-8ccb-7ca3d0262f87" containerID="713881e78b884c24350793eac38185205bfd327b820ad0bab5e33b5da5481593" exitCode=0 Apr 23 17:22:19.553214 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:19.552836 2562 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" event={"ID":"a8b72e29-d15e-481a-8ccb-7ca3d0262f87","Type":"ContainerDied","Data":"713881e78b884c24350793eac38185205bfd327b820ad0bab5e33b5da5481593"} Apr 23 17:22:20.556802 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:20.556763 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" event={"ID":"a8b72e29-d15e-481a-8ccb-7ca3d0262f87","Type":"ContainerStarted","Data":"e769f24504444ac3da1375fd5e3715c142319cb957eb0fd02f7b93a88ac803b9"} Apr 23 17:22:20.557192 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:20.557065 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" Apr 23 17:22:20.558265 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:20.558236 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" podUID="a8b72e29-d15e-481a-8ccb-7ca3d0262f87" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 23 17:22:20.574954 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:20.574906 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" podStartSLOduration=6.574892467 podStartE2EDuration="6.574892467s" podCreationTimestamp="2026-04-23 17:22:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:22:20.573378728 +0000 UTC m=+2820.426193793" watchObservedRunningTime="2026-04-23 17:22:20.574892467 +0000 UTC m=+2820.427707531" Apr 23 17:22:21.559792 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:21.559731 2562 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" podUID="a8b72e29-d15e-481a-8ccb-7ca3d0262f87" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 23 17:22:31.560495 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:31.560462 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" Apr 23 17:22:44.085502 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:22:44.085469 2562 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b481433_d513_415b_8524_f66f00ee88f2.slice/crio-3e7b836f6e128d0705637b9c4d2d235d7ddd5be58a1f31ff916897e24e0d2833\": RecentStats: unable to find data in memory cache]" Apr 23 17:22:44.448186 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:44.448132 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" podUID="0b481433-d513-415b-8524-f66f00ee88f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 23 17:22:44.631734 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:44.631703 2562 generic.go:358] "Generic (PLEG): container finished" podID="0b481433-d513-415b-8524-f66f00ee88f2" containerID="23ee6b180c7f2142720f979e92ffeb0870e8f71aa020038a57d002e834c29756" exitCode=137 Apr 23 17:22:44.631935 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:44.631781 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" event={"ID":"0b481433-d513-415b-8524-f66f00ee88f2","Type":"ContainerDied","Data":"23ee6b180c7f2142720f979e92ffeb0870e8f71aa020038a57d002e834c29756"} Apr 23 17:22:44.702398 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:44.702329 2562 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" Apr 23 17:22:44.788036 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:44.788004 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b481433-d513-415b-8524-f66f00ee88f2-kserve-provision-location\") pod \"0b481433-d513-415b-8524-f66f00ee88f2\" (UID: \"0b481433-d513-415b-8524-f66f00ee88f2\") " Apr 23 17:22:44.798992 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:44.798966 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b481433-d513-415b-8524-f66f00ee88f2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0b481433-d513-415b-8524-f66f00ee88f2" (UID: "0b481433-d513-415b-8524-f66f00ee88f2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:22:44.889146 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:44.889106 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b481433-d513-415b-8524-f66f00ee88f2-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:22:45.470841 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.470801 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx"] Apr 23 17:22:45.471310 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.471170 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" podUID="a8b72e29-d15e-481a-8ccb-7ca3d0262f87" containerName="kserve-container" containerID="cri-o://e769f24504444ac3da1375fd5e3715c142319cb957eb0fd02f7b93a88ac803b9" gracePeriod=30 Apr 23 17:22:45.537848 
ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.537816 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns"] Apr 23 17:22:45.538141 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.538128 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b481433-d513-415b-8524-f66f00ee88f2" containerName="kserve-container" Apr 23 17:22:45.538187 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.538143 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b481433-d513-415b-8524-f66f00ee88f2" containerName="kserve-container" Apr 23 17:22:45.538187 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.538160 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b481433-d513-415b-8524-f66f00ee88f2" containerName="storage-initializer" Apr 23 17:22:45.538187 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.538166 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b481433-d513-415b-8524-f66f00ee88f2" containerName="storage-initializer" Apr 23 17:22:45.538284 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.538228 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b481433-d513-415b-8524-f66f00ee88f2" containerName="kserve-container" Apr 23 17:22:45.540321 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.540307 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" Apr 23 17:22:45.550518 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.550493 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns"] Apr 23 17:22:45.635652 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.635626 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" Apr 23 17:22:45.635845 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.635626 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx" event={"ID":"0b481433-d513-415b-8524-f66f00ee88f2","Type":"ContainerDied","Data":"3e7b836f6e128d0705637b9c4d2d235d7ddd5be58a1f31ff916897e24e0d2833"} Apr 23 17:22:45.635845 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.635772 2562 scope.go:117] "RemoveContainer" containerID="23ee6b180c7f2142720f979e92ffeb0870e8f71aa020038a57d002e834c29756" Apr 23 17:22:45.643790 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.643767 2562 scope.go:117] "RemoveContainer" containerID="22d093f27b320f8a050ba0da802bb9fb9bd6bf47ff500a954ce02b0bb6537c58" Apr 23 17:22:45.656491 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.656464 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx"] Apr 23 17:22:45.660326 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.660301 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-4n8rx"] Apr 23 17:22:45.695065 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.695030 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca61059c-cb89-4653-a588-48f645e74d70-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-dwrns\" (UID: \"ca61059c-cb89-4653-a588-48f645e74d70\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" Apr 23 17:22:45.796448 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.796352 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/ca61059c-cb89-4653-a588-48f645e74d70-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-dwrns\" (UID: \"ca61059c-cb89-4653-a588-48f645e74d70\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" Apr 23 17:22:45.796785 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.796732 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca61059c-cb89-4653-a588-48f645e74d70-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-dwrns\" (UID: \"ca61059c-cb89-4653-a588-48f645e74d70\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" Apr 23 17:22:45.850778 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.850726 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" Apr 23 17:22:45.974393 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:45.974366 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns"] Apr 23 17:22:45.976780 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:22:45.976728 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca61059c_cb89_4653_a588_48f645e74d70.slice/crio-e44976fdc19dd3f75ac05cbca6f2a2608a9553217c46b8a1cb593ba48caa6b9d WatchSource:0}: Error finding container e44976fdc19dd3f75ac05cbca6f2a2608a9553217c46b8a1cb593ba48caa6b9d: Status 404 returned error can't find the container with id e44976fdc19dd3f75ac05cbca6f2a2608a9553217c46b8a1cb593ba48caa6b9d Apr 23 17:22:46.640513 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:46.640474 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" 
event={"ID":"ca61059c-cb89-4653-a588-48f645e74d70","Type":"ContainerStarted","Data":"5ed48fae2c0e969518a68be8c86a6d695c4fe1cbbf22bb4741db8f8178d53319"} Apr 23 17:22:46.640928 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:46.640519 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" event={"ID":"ca61059c-cb89-4653-a588-48f645e74d70","Type":"ContainerStarted","Data":"e44976fdc19dd3f75ac05cbca6f2a2608a9553217c46b8a1cb593ba48caa6b9d"} Apr 23 17:22:46.742566 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:46.742531 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b481433-d513-415b-8524-f66f00ee88f2" path="/var/lib/kubelet/pods/0b481433-d513-415b-8524-f66f00ee88f2/volumes" Apr 23 17:22:50.653549 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:50.653513 2562 generic.go:358] "Generic (PLEG): container finished" podID="ca61059c-cb89-4653-a588-48f645e74d70" containerID="5ed48fae2c0e969518a68be8c86a6d695c4fe1cbbf22bb4741db8f8178d53319" exitCode=0 Apr 23 17:22:50.653946 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:22:50.653592 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" event={"ID":"ca61059c-cb89-4653-a588-48f645e74d70","Type":"ContainerDied","Data":"5ed48fae2c0e969518a68be8c86a6d695c4fe1cbbf22bb4741db8f8178d53319"} Apr 23 17:23:15.784860 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:23:15.784770 2562 generic.go:358] "Generic (PLEG): container finished" podID="a8b72e29-d15e-481a-8ccb-7ca3d0262f87" containerID="e769f24504444ac3da1375fd5e3715c142319cb957eb0fd02f7b93a88ac803b9" exitCode=137 Apr 23 17:23:15.785348 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:23:15.784921 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" 
event={"ID":"a8b72e29-d15e-481a-8ccb-7ca3d0262f87","Type":"ContainerDied","Data":"e769f24504444ac3da1375fd5e3715c142319cb957eb0fd02f7b93a88ac803b9"} Apr 23 17:23:16.162546 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:23:16.162515 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" Apr 23 17:23:16.284819 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:23:16.284722 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8b72e29-d15e-481a-8ccb-7ca3d0262f87-kserve-provision-location\") pod \"a8b72e29-d15e-481a-8ccb-7ca3d0262f87\" (UID: \"a8b72e29-d15e-481a-8ccb-7ca3d0262f87\") " Apr 23 17:23:16.293856 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:23:16.293805 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8b72e29-d15e-481a-8ccb-7ca3d0262f87-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a8b72e29-d15e-481a-8ccb-7ca3d0262f87" (UID: "a8b72e29-d15e-481a-8ccb-7ca3d0262f87"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:23:16.385893 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:23:16.385795 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8b72e29-d15e-481a-8ccb-7ca3d0262f87-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:23:16.790561 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:23:16.790517 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" event={"ID":"a8b72e29-d15e-481a-8ccb-7ca3d0262f87","Type":"ContainerDied","Data":"1d29267fa20359c3e4f5aa1362acf4068414adc84f7d281b98ce48a66e9cc0dc"} Apr 23 17:23:16.791025 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:23:16.790575 2562 scope.go:117] "RemoveContainer" containerID="e769f24504444ac3da1375fd5e3715c142319cb957eb0fd02f7b93a88ac803b9" Apr 23 17:23:16.791025 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:23:16.790617 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx" Apr 23 17:23:16.801934 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:23:16.801906 2562 scope.go:117] "RemoveContainer" containerID="713881e78b884c24350793eac38185205bfd327b820ad0bab5e33b5da5481593" Apr 23 17:23:16.809352 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:23:16.809318 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx"] Apr 23 17:23:16.815325 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:23:16.815300 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4zvvx"] Apr 23 17:23:18.744924 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:23:18.744778 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8b72e29-d15e-481a-8ccb-7ca3d0262f87" path="/var/lib/kubelet/pods/a8b72e29-d15e-481a-8ccb-7ca3d0262f87/volumes" Apr 23 17:24:46.082009 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:24:46.081971 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" event={"ID":"ca61059c-cb89-4653-a588-48f645e74d70","Type":"ContainerStarted","Data":"5e3aa35b6fac9d79558732442ed7d344d63b8a731e6133cbbda7e0865486cbab"} Apr 23 17:24:46.082480 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:24:46.082168 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" Apr 23 17:24:46.083399 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:24:46.083372 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" podUID="ca61059c-cb89-4653-a588-48f645e74d70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused" Apr 23 17:24:46.100861 ip-10-0-135-57 kubenswrapper[2562]: I0423 
17:24:46.100807 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" podStartSLOduration=6.642278768 podStartE2EDuration="2m1.100793994s" podCreationTimestamp="2026-04-23 17:22:45 +0000 UTC" firstStartedPulling="2026-04-23 17:22:50.654614295 +0000 UTC m=+2850.507429337" lastFinishedPulling="2026-04-23 17:24:45.113129509 +0000 UTC m=+2964.965944563" observedRunningTime="2026-04-23 17:24:46.099017025 +0000 UTC m=+2965.951832091" watchObservedRunningTime="2026-04-23 17:24:46.100793994 +0000 UTC m=+2965.953609058" Apr 23 17:24:47.085480 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:24:47.085440 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" podUID="ca61059c-cb89-4653-a588-48f645e74d70" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused" Apr 23 17:24:57.086673 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:24:57.086643 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" Apr 23 17:25:07.112180 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:07.112147 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns"] Apr 23 17:25:07.112644 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:07.112444 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" podUID="ca61059c-cb89-4653-a588-48f645e74d70" containerName="kserve-container" containerID="cri-o://5e3aa35b6fac9d79558732442ed7d344d63b8a731e6133cbbda7e0865486cbab" gracePeriod=30 Apr 23 17:25:07.189581 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:07.188819 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx"] Apr 23 
17:25:07.189581 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:07.189502 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8b72e29-d15e-481a-8ccb-7ca3d0262f87" containerName="kserve-container" Apr 23 17:25:07.189581 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:07.189525 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b72e29-d15e-481a-8ccb-7ca3d0262f87" containerName="kserve-container" Apr 23 17:25:07.189581 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:07.189545 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8b72e29-d15e-481a-8ccb-7ca3d0262f87" containerName="storage-initializer" Apr 23 17:25:07.189581 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:07.189557 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8b72e29-d15e-481a-8ccb-7ca3d0262f87" containerName="storage-initializer" Apr 23 17:25:07.189979 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:07.189722 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8b72e29-d15e-481a-8ccb-7ca3d0262f87" containerName="kserve-container" Apr 23 17:25:07.193283 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:07.193258 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" Apr 23 17:25:07.197087 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:07.197060 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx"] Apr 23 17:25:07.284797 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:07.284733 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41d2fc84-8021-4390-a4c3-11f48835abf7-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-rx2wx\" (UID: \"41d2fc84-8021-4390-a4c3-11f48835abf7\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" Apr 23 17:25:07.385606 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:07.385511 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41d2fc84-8021-4390-a4c3-11f48835abf7-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-rx2wx\" (UID: \"41d2fc84-8021-4390-a4c3-11f48835abf7\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" Apr 23 17:25:07.385944 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:07.385921 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41d2fc84-8021-4390-a4c3-11f48835abf7-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-rx2wx\" (UID: \"41d2fc84-8021-4390-a4c3-11f48835abf7\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" Apr 23 17:25:07.505038 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:07.505000 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" Apr 23 17:25:07.628117 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:07.628000 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx"] Apr 23 17:25:07.630912 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:25:07.630877 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41d2fc84_8021_4390_a4c3_11f48835abf7.slice/crio-45d56c1c5c03751e43b5612b15f736d200550e10001eefcff3d86400d8ee6a58 WatchSource:0}: Error finding container 45d56c1c5c03751e43b5612b15f736d200550e10001eefcff3d86400d8ee6a58: Status 404 returned error can't find the container with id 45d56c1c5c03751e43b5612b15f736d200550e10001eefcff3d86400d8ee6a58 Apr 23 17:25:08.154227 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:08.154194 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" event={"ID":"41d2fc84-8021-4390-a4c3-11f48835abf7","Type":"ContainerStarted","Data":"5394ebb42e8bb29fe8db2b001ac3c10a8047afbdb951a1111226f110db064f6b"} Apr 23 17:25:08.154227 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:08.154229 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" event={"ID":"41d2fc84-8021-4390-a4c3-11f48835abf7","Type":"ContainerStarted","Data":"45d56c1c5c03751e43b5612b15f736d200550e10001eefcff3d86400d8ee6a58"} Apr 23 17:25:09.984678 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:09.984653 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" Apr 23 17:25:10.110407 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.110312 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca61059c-cb89-4653-a588-48f645e74d70-kserve-provision-location\") pod \"ca61059c-cb89-4653-a588-48f645e74d70\" (UID: \"ca61059c-cb89-4653-a588-48f645e74d70\") " Apr 23 17:25:10.110699 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.110674 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca61059c-cb89-4653-a588-48f645e74d70-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca61059c-cb89-4653-a588-48f645e74d70" (UID: "ca61059c-cb89-4653-a588-48f645e74d70"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:25:10.161628 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.161593 2562 generic.go:358] "Generic (PLEG): container finished" podID="ca61059c-cb89-4653-a588-48f645e74d70" containerID="5e3aa35b6fac9d79558732442ed7d344d63b8a731e6133cbbda7e0865486cbab" exitCode=0 Apr 23 17:25:10.161823 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.161663 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" Apr 23 17:25:10.161823 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.161673 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" event={"ID":"ca61059c-cb89-4653-a588-48f645e74d70","Type":"ContainerDied","Data":"5e3aa35b6fac9d79558732442ed7d344d63b8a731e6133cbbda7e0865486cbab"} Apr 23 17:25:10.161823 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.161709 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns" event={"ID":"ca61059c-cb89-4653-a588-48f645e74d70","Type":"ContainerDied","Data":"e44976fdc19dd3f75ac05cbca6f2a2608a9553217c46b8a1cb593ba48caa6b9d"} Apr 23 17:25:10.161823 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.161726 2562 scope.go:117] "RemoveContainer" containerID="5e3aa35b6fac9d79558732442ed7d344d63b8a731e6133cbbda7e0865486cbab" Apr 23 17:25:10.169923 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.169903 2562 scope.go:117] "RemoveContainer" containerID="5ed48fae2c0e969518a68be8c86a6d695c4fe1cbbf22bb4741db8f8178d53319" Apr 23 17:25:10.177331 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.177315 2562 scope.go:117] "RemoveContainer" containerID="5e3aa35b6fac9d79558732442ed7d344d63b8a731e6133cbbda7e0865486cbab" Apr 23 17:25:10.177580 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:25:10.177564 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e3aa35b6fac9d79558732442ed7d344d63b8a731e6133cbbda7e0865486cbab\": container with ID starting with 5e3aa35b6fac9d79558732442ed7d344d63b8a731e6133cbbda7e0865486cbab not found: ID does not exist" containerID="5e3aa35b6fac9d79558732442ed7d344d63b8a731e6133cbbda7e0865486cbab" Apr 23 17:25:10.177620 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.177589 2562 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"5e3aa35b6fac9d79558732442ed7d344d63b8a731e6133cbbda7e0865486cbab"} err="failed to get container status \"5e3aa35b6fac9d79558732442ed7d344d63b8a731e6133cbbda7e0865486cbab\": rpc error: code = NotFound desc = could not find container \"5e3aa35b6fac9d79558732442ed7d344d63b8a731e6133cbbda7e0865486cbab\": container with ID starting with 5e3aa35b6fac9d79558732442ed7d344d63b8a731e6133cbbda7e0865486cbab not found: ID does not exist" Apr 23 17:25:10.177620 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.177608 2562 scope.go:117] "RemoveContainer" containerID="5ed48fae2c0e969518a68be8c86a6d695c4fe1cbbf22bb4741db8f8178d53319" Apr 23 17:25:10.177824 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:25:10.177810 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ed48fae2c0e969518a68be8c86a6d695c4fe1cbbf22bb4741db8f8178d53319\": container with ID starting with 5ed48fae2c0e969518a68be8c86a6d695c4fe1cbbf22bb4741db8f8178d53319 not found: ID does not exist" containerID="5ed48fae2c0e969518a68be8c86a6d695c4fe1cbbf22bb4741db8f8178d53319" Apr 23 17:25:10.177870 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.177826 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ed48fae2c0e969518a68be8c86a6d695c4fe1cbbf22bb4741db8f8178d53319"} err="failed to get container status \"5ed48fae2c0e969518a68be8c86a6d695c4fe1cbbf22bb4741db8f8178d53319\": rpc error: code = NotFound desc = could not find container \"5ed48fae2c0e969518a68be8c86a6d695c4fe1cbbf22bb4741db8f8178d53319\": container with ID starting with 5ed48fae2c0e969518a68be8c86a6d695c4fe1cbbf22bb4741db8f8178d53319 not found: ID does not exist" Apr 23 17:25:10.183185 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.183161 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns"] Apr 23 17:25:10.187779 
ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.187729 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-dwrns"] Apr 23 17:25:10.211568 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.211526 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca61059c-cb89-4653-a588-48f645e74d70-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:25:10.742424 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:10.742392 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca61059c-cb89-4653-a588-48f645e74d70" path="/var/lib/kubelet/pods/ca61059c-cb89-4653-a588-48f645e74d70/volumes" Apr 23 17:25:12.169908 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:12.169875 2562 generic.go:358] "Generic (PLEG): container finished" podID="41d2fc84-8021-4390-a4c3-11f48835abf7" containerID="5394ebb42e8bb29fe8db2b001ac3c10a8047afbdb951a1111226f110db064f6b" exitCode=0 Apr 23 17:25:12.170290 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:12.169951 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" event={"ID":"41d2fc84-8021-4390-a4c3-11f48835abf7","Type":"ContainerDied","Data":"5394ebb42e8bb29fe8db2b001ac3c10a8047afbdb951a1111226f110db064f6b"} Apr 23 17:25:32.231462 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:32.231422 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" event={"ID":"41d2fc84-8021-4390-a4c3-11f48835abf7","Type":"ContainerStarted","Data":"fdc82f2f47d2b894d3b4c23b076fc10b0d3661336c8f01f3e68e9f2958371d58"} Apr 23 17:25:32.232064 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:32.231770 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" Apr 23 
17:25:32.233164 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:32.233136 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" podUID="41d2fc84-8021-4390-a4c3-11f48835abf7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 23 17:25:32.248101 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:32.248048 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" podStartSLOduration=5.896226726 podStartE2EDuration="25.248030575s" podCreationTimestamp="2026-04-23 17:25:07 +0000 UTC" firstStartedPulling="2026-04-23 17:25:12.17112374 +0000 UTC m=+2992.023938782" lastFinishedPulling="2026-04-23 17:25:31.52292759 +0000 UTC m=+3011.375742631" observedRunningTime="2026-04-23 17:25:32.247103255 +0000 UTC m=+3012.099918319" watchObservedRunningTime="2026-04-23 17:25:32.248030575 +0000 UTC m=+3012.100845640" Apr 23 17:25:33.234228 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:33.234189 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" podUID="41d2fc84-8021-4390-a4c3-11f48835abf7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 23 17:25:43.234498 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:43.234459 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" podUID="41d2fc84-8021-4390-a4c3-11f48835abf7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 23 17:25:53.234466 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:25:53.234421 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" 
podUID="41d2fc84-8021-4390-a4c3-11f48835abf7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 23 17:26:03.234732 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:03.234694 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" podUID="41d2fc84-8021-4390-a4c3-11f48835abf7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 23 17:26:13.234378 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:13.234333 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" podUID="41d2fc84-8021-4390-a4c3-11f48835abf7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 23 17:26:23.234313 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:23.234270 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" podUID="41d2fc84-8021-4390-a4c3-11f48835abf7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 23 17:26:33.235798 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:33.235765 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" Apr 23 17:26:37.338677 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:37.338640 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx"] Apr 23 17:26:37.339087 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:37.338923 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" podUID="41d2fc84-8021-4390-a4c3-11f48835abf7" containerName="kserve-container" 
containerID="cri-o://fdc82f2f47d2b894d3b4c23b076fc10b0d3661336c8f01f3e68e9f2958371d58" gracePeriod=30 Apr 23 17:26:37.450023 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:37.449988 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r"] Apr 23 17:26:37.450325 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:37.450313 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca61059c-cb89-4653-a588-48f645e74d70" containerName="storage-initializer" Apr 23 17:26:37.450378 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:37.450327 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca61059c-cb89-4653-a588-48f645e74d70" containerName="storage-initializer" Apr 23 17:26:37.450378 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:37.450347 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca61059c-cb89-4653-a588-48f645e74d70" containerName="kserve-container" Apr 23 17:26:37.450378 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:37.450352 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca61059c-cb89-4653-a588-48f645e74d70" containerName="kserve-container" Apr 23 17:26:37.450492 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:37.450405 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca61059c-cb89-4653-a588-48f645e74d70" containerName="kserve-container" Apr 23 17:26:37.453213 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:37.453192 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" Apr 23 17:26:37.463558 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:37.463534 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r"] Apr 23 17:26:37.549799 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:37.549733 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8872a186-969e-434a-af09-8dccadcaf56f-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r\" (UID: \"8872a186-969e-434a-af09-8dccadcaf56f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" Apr 23 17:26:37.650808 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:37.650764 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8872a186-969e-434a-af09-8dccadcaf56f-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r\" (UID: \"8872a186-969e-434a-af09-8dccadcaf56f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" Apr 23 17:26:37.651152 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:37.651131 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8872a186-969e-434a-af09-8dccadcaf56f-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r\" (UID: \"8872a186-969e-434a-af09-8dccadcaf56f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" Apr 23 17:26:37.764142 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:37.764108 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" Apr 23 17:26:37.884051 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:37.884026 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r"] Apr 23 17:26:37.886286 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:26:37.886264 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8872a186_969e_434a_af09_8dccadcaf56f.slice/crio-74d81257a0a8abadc9527cc7610be65ba6c223a4e35867e724a870dc96958a1a WatchSource:0}: Error finding container 74d81257a0a8abadc9527cc7610be65ba6c223a4e35867e724a870dc96958a1a: Status 404 returned error can't find the container with id 74d81257a0a8abadc9527cc7610be65ba6c223a4e35867e724a870dc96958a1a Apr 23 17:26:38.424615 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:38.424578 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" event={"ID":"8872a186-969e-434a-af09-8dccadcaf56f","Type":"ContainerStarted","Data":"c71d25df585fa6f93b1de1e5bcf3e21ea3f00d9726c565aabe6ecca5234f847d"} Apr 23 17:26:38.424615 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:38.424617 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" event={"ID":"8872a186-969e-434a-af09-8dccadcaf56f","Type":"ContainerStarted","Data":"74d81257a0a8abadc9527cc7610be65ba6c223a4e35867e724a870dc96958a1a"} Apr 23 17:26:40.994273 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:40.994250 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" Apr 23 17:26:41.081997 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:41.081958 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41d2fc84-8021-4390-a4c3-11f48835abf7-kserve-provision-location\") pod \"41d2fc84-8021-4390-a4c3-11f48835abf7\" (UID: \"41d2fc84-8021-4390-a4c3-11f48835abf7\") " Apr 23 17:26:41.082371 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:41.082343 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41d2fc84-8021-4390-a4c3-11f48835abf7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "41d2fc84-8021-4390-a4c3-11f48835abf7" (UID: "41d2fc84-8021-4390-a4c3-11f48835abf7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:26:41.182799 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:41.182766 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41d2fc84-8021-4390-a4c3-11f48835abf7-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:26:41.435134 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:41.435033 2562 generic.go:358] "Generic (PLEG): container finished" podID="41d2fc84-8021-4390-a4c3-11f48835abf7" containerID="fdc82f2f47d2b894d3b4c23b076fc10b0d3661336c8f01f3e68e9f2958371d58" exitCode=0 Apr 23 17:26:41.435134 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:41.435120 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" Apr 23 17:26:41.435134 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:41.435121 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" event={"ID":"41d2fc84-8021-4390-a4c3-11f48835abf7","Type":"ContainerDied","Data":"fdc82f2f47d2b894d3b4c23b076fc10b0d3661336c8f01f3e68e9f2958371d58"} Apr 23 17:26:41.435360 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:41.435164 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx" event={"ID":"41d2fc84-8021-4390-a4c3-11f48835abf7","Type":"ContainerDied","Data":"45d56c1c5c03751e43b5612b15f736d200550e10001eefcff3d86400d8ee6a58"} Apr 23 17:26:41.435360 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:41.435187 2562 scope.go:117] "RemoveContainer" containerID="fdc82f2f47d2b894d3b4c23b076fc10b0d3661336c8f01f3e68e9f2958371d58" Apr 23 17:26:41.443193 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:41.443177 2562 scope.go:117] "RemoveContainer" containerID="5394ebb42e8bb29fe8db2b001ac3c10a8047afbdb951a1111226f110db064f6b" Apr 23 17:26:41.450097 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:41.450081 2562 scope.go:117] "RemoveContainer" containerID="fdc82f2f47d2b894d3b4c23b076fc10b0d3661336c8f01f3e68e9f2958371d58" Apr 23 17:26:41.450339 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:26:41.450321 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdc82f2f47d2b894d3b4c23b076fc10b0d3661336c8f01f3e68e9f2958371d58\": container with ID starting with fdc82f2f47d2b894d3b4c23b076fc10b0d3661336c8f01f3e68e9f2958371d58 not found: ID does not exist" containerID="fdc82f2f47d2b894d3b4c23b076fc10b0d3661336c8f01f3e68e9f2958371d58" Apr 23 17:26:41.450394 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:41.450348 2562 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"fdc82f2f47d2b894d3b4c23b076fc10b0d3661336c8f01f3e68e9f2958371d58"} err="failed to get container status \"fdc82f2f47d2b894d3b4c23b076fc10b0d3661336c8f01f3e68e9f2958371d58\": rpc error: code = NotFound desc = could not find container \"fdc82f2f47d2b894d3b4c23b076fc10b0d3661336c8f01f3e68e9f2958371d58\": container with ID starting with fdc82f2f47d2b894d3b4c23b076fc10b0d3661336c8f01f3e68e9f2958371d58 not found: ID does not exist" Apr 23 17:26:41.450394 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:41.450368 2562 scope.go:117] "RemoveContainer" containerID="5394ebb42e8bb29fe8db2b001ac3c10a8047afbdb951a1111226f110db064f6b" Apr 23 17:26:41.450615 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:26:41.450598 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5394ebb42e8bb29fe8db2b001ac3c10a8047afbdb951a1111226f110db064f6b\": container with ID starting with 5394ebb42e8bb29fe8db2b001ac3c10a8047afbdb951a1111226f110db064f6b not found: ID does not exist" containerID="5394ebb42e8bb29fe8db2b001ac3c10a8047afbdb951a1111226f110db064f6b" Apr 23 17:26:41.450674 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:41.450621 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5394ebb42e8bb29fe8db2b001ac3c10a8047afbdb951a1111226f110db064f6b"} err="failed to get container status \"5394ebb42e8bb29fe8db2b001ac3c10a8047afbdb951a1111226f110db064f6b\": rpc error: code = NotFound desc = could not find container \"5394ebb42e8bb29fe8db2b001ac3c10a8047afbdb951a1111226f110db064f6b\": container with ID starting with 5394ebb42e8bb29fe8db2b001ac3c10a8047afbdb951a1111226f110db064f6b not found: ID does not exist" Apr 23 17:26:41.455938 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:41.455913 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx"] Apr 23 17:26:41.459302 
ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:41.459282 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rx2wx"] Apr 23 17:26:42.439829 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:42.439792 2562 generic.go:358] "Generic (PLEG): container finished" podID="8872a186-969e-434a-af09-8dccadcaf56f" containerID="c71d25df585fa6f93b1de1e5bcf3e21ea3f00d9726c565aabe6ecca5234f847d" exitCode=0 Apr 23 17:26:42.440312 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:42.439853 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" event={"ID":"8872a186-969e-434a-af09-8dccadcaf56f","Type":"ContainerDied","Data":"c71d25df585fa6f93b1de1e5bcf3e21ea3f00d9726c565aabe6ecca5234f847d"} Apr 23 17:26:42.441031 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:42.441014 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 17:26:42.742157 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:42.742073 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d2fc84-8021-4390-a4c3-11f48835abf7" path="/var/lib/kubelet/pods/41d2fc84-8021-4390-a4c3-11f48835abf7/volumes" Apr 23 17:26:43.445426 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:43.445391 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" event={"ID":"8872a186-969e-434a-af09-8dccadcaf56f","Type":"ContainerStarted","Data":"9fbb3dd0133265290702c124f57374e85c24caeba2ea9c6a9d13bed0e7116f2b"} Apr 23 17:26:43.445820 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:43.445614 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" Apr 23 17:26:43.462947 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:26:43.462901 2562 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" podStartSLOduration=6.462887029 podStartE2EDuration="6.462887029s" podCreationTimestamp="2026-04-23 17:26:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:26:43.46235541 +0000 UTC m=+3083.315170479" watchObservedRunningTime="2026-04-23 17:26:43.462887029 +0000 UTC m=+3083.315702093" Apr 23 17:27:14.450461 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:14.450417 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" podUID="8872a186-969e-434a-af09-8dccadcaf56f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.58:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.133.0.58:8080: connect: connection refused" Apr 23 17:27:24.452870 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:24.452840 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" Apr 23 17:27:27.518798 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:27.518758 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r"] Apr 23 17:27:27.519199 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:27.519029 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" podUID="8872a186-969e-434a-af09-8dccadcaf56f" containerName="kserve-container" containerID="cri-o://9fbb3dd0133265290702c124f57374e85c24caeba2ea9c6a9d13bed0e7116f2b" gracePeriod=30 Apr 23 17:27:27.602831 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:27.602795 2562 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq"] Apr 23 17:27:27.603125 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:27.603112 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41d2fc84-8021-4390-a4c3-11f48835abf7" containerName="kserve-container" Apr 23 17:27:27.603169 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:27.603127 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d2fc84-8021-4390-a4c3-11f48835abf7" containerName="kserve-container" Apr 23 17:27:27.603169 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:27.603139 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41d2fc84-8021-4390-a4c3-11f48835abf7" containerName="storage-initializer" Apr 23 17:27:27.603169 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:27.603145 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d2fc84-8021-4390-a4c3-11f48835abf7" containerName="storage-initializer" Apr 23 17:27:27.603267 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:27.603221 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="41d2fc84-8021-4390-a4c3-11f48835abf7" containerName="kserve-container" Apr 23 17:27:27.607012 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:27.606995 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" Apr 23 17:27:27.614464 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:27.614440 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq"] Apr 23 17:27:27.682189 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:27.682159 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fbb239e-faae-4c08-ae51-c15c4253a406-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-4hphq\" (UID: \"6fbb239e-faae-4c08-ae51-c15c4253a406\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" Apr 23 17:27:27.782927 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:27.782838 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fbb239e-faae-4c08-ae51-c15c4253a406-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-4hphq\" (UID: \"6fbb239e-faae-4c08-ae51-c15c4253a406\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" Apr 23 17:27:27.783218 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:27.783196 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fbb239e-faae-4c08-ae51-c15c4253a406-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-4hphq\" (UID: \"6fbb239e-faae-4c08-ae51-c15c4253a406\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" Apr 23 17:27:27.917531 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:27.917493 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" Apr 23 17:27:28.039880 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:28.039804 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq"] Apr 23 17:27:28.043106 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:27:28.043078 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fbb239e_faae_4c08_ae51_c15c4253a406.slice/crio-882fb3a71b6ae1ace7c6a28f5971b04d22a15d9ac736dd6d488ca458947d4547 WatchSource:0}: Error finding container 882fb3a71b6ae1ace7c6a28f5971b04d22a15d9ac736dd6d488ca458947d4547: Status 404 returned error can't find the container with id 882fb3a71b6ae1ace7c6a28f5971b04d22a15d9ac736dd6d488ca458947d4547 Apr 23 17:27:28.577462 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:28.577427 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" event={"ID":"6fbb239e-faae-4c08-ae51-c15c4253a406","Type":"ContainerStarted","Data":"6da3f506b6e2f0ed138f0f1f2fc2243af31fd6247a719be0b439dcfe94158d45"} Apr 23 17:27:28.577462 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:28.577465 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" event={"ID":"6fbb239e-faae-4c08-ae51-c15c4253a406","Type":"ContainerStarted","Data":"882fb3a71b6ae1ace7c6a28f5971b04d22a15d9ac736dd6d488ca458947d4547"} Apr 23 17:27:32.590220 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:32.590186 2562 generic.go:358] "Generic (PLEG): container finished" podID="6fbb239e-faae-4c08-ae51-c15c4253a406" containerID="6da3f506b6e2f0ed138f0f1f2fc2243af31fd6247a719be0b439dcfe94158d45" exitCode=0 Apr 23 17:27:32.590613 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:32.590257 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" event={"ID":"6fbb239e-faae-4c08-ae51-c15c4253a406","Type":"ContainerDied","Data":"6da3f506b6e2f0ed138f0f1f2fc2243af31fd6247a719be0b439dcfe94158d45"} Apr 23 17:27:33.594629 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:33.594596 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" event={"ID":"6fbb239e-faae-4c08-ae51-c15c4253a406","Type":"ContainerStarted","Data":"b33dee054082f929d0d0c2400cfed82d892a6a88e8ac09f822d0d074d55dacab"} Apr 23 17:27:33.595129 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:33.594903 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" Apr 23 17:27:33.610964 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:33.610913 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" podStartSLOduration=6.610897971 podStartE2EDuration="6.610897971s" podCreationTimestamp="2026-04-23 17:27:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:27:33.610318311 +0000 UTC m=+3133.463133376" watchObservedRunningTime="2026-04-23 17:27:33.610897971 +0000 UTC m=+3133.463713035" Apr 23 17:27:34.057176 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.057152 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" Apr 23 17:27:34.132068 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.131984 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8872a186-969e-434a-af09-8dccadcaf56f-kserve-provision-location\") pod \"8872a186-969e-434a-af09-8dccadcaf56f\" (UID: \"8872a186-969e-434a-af09-8dccadcaf56f\") " Apr 23 17:27:34.132368 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.132346 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8872a186-969e-434a-af09-8dccadcaf56f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8872a186-969e-434a-af09-8dccadcaf56f" (UID: "8872a186-969e-434a-af09-8dccadcaf56f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:27:34.232678 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.232625 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8872a186-969e-434a-af09-8dccadcaf56f-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:27:34.601533 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.601500 2562 generic.go:358] "Generic (PLEG): container finished" podID="8872a186-969e-434a-af09-8dccadcaf56f" containerID="9fbb3dd0133265290702c124f57374e85c24caeba2ea9c6a9d13bed0e7116f2b" exitCode=0 Apr 23 17:27:34.602017 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.601553 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" event={"ID":"8872a186-969e-434a-af09-8dccadcaf56f","Type":"ContainerDied","Data":"9fbb3dd0133265290702c124f57374e85c24caeba2ea9c6a9d13bed0e7116f2b"} Apr 23 17:27:34.602017 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:27:34.601564 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" Apr 23 17:27:34.602017 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.601594 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r" event={"ID":"8872a186-969e-434a-af09-8dccadcaf56f","Type":"ContainerDied","Data":"74d81257a0a8abadc9527cc7610be65ba6c223a4e35867e724a870dc96958a1a"} Apr 23 17:27:34.602017 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.601614 2562 scope.go:117] "RemoveContainer" containerID="9fbb3dd0133265290702c124f57374e85c24caeba2ea9c6a9d13bed0e7116f2b" Apr 23 17:27:34.609632 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.609612 2562 scope.go:117] "RemoveContainer" containerID="c71d25df585fa6f93b1de1e5bcf3e21ea3f00d9726c565aabe6ecca5234f847d" Apr 23 17:27:34.616472 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.616455 2562 scope.go:117] "RemoveContainer" containerID="9fbb3dd0133265290702c124f57374e85c24caeba2ea9c6a9d13bed0e7116f2b" Apr 23 17:27:34.616708 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:27:34.616685 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fbb3dd0133265290702c124f57374e85c24caeba2ea9c6a9d13bed0e7116f2b\": container with ID starting with 9fbb3dd0133265290702c124f57374e85c24caeba2ea9c6a9d13bed0e7116f2b not found: ID does not exist" containerID="9fbb3dd0133265290702c124f57374e85c24caeba2ea9c6a9d13bed0e7116f2b" Apr 23 17:27:34.616906 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.616720 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fbb3dd0133265290702c124f57374e85c24caeba2ea9c6a9d13bed0e7116f2b"} err="failed to get container status \"9fbb3dd0133265290702c124f57374e85c24caeba2ea9c6a9d13bed0e7116f2b\": rpc 
error: code = NotFound desc = could not find container \"9fbb3dd0133265290702c124f57374e85c24caeba2ea9c6a9d13bed0e7116f2b\": container with ID starting with 9fbb3dd0133265290702c124f57374e85c24caeba2ea9c6a9d13bed0e7116f2b not found: ID does not exist" Apr 23 17:27:34.616906 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.616763 2562 scope.go:117] "RemoveContainer" containerID="c71d25df585fa6f93b1de1e5bcf3e21ea3f00d9726c565aabe6ecca5234f847d" Apr 23 17:27:34.617019 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:27:34.617001 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c71d25df585fa6f93b1de1e5bcf3e21ea3f00d9726c565aabe6ecca5234f847d\": container with ID starting with c71d25df585fa6f93b1de1e5bcf3e21ea3f00d9726c565aabe6ecca5234f847d not found: ID does not exist" containerID="c71d25df585fa6f93b1de1e5bcf3e21ea3f00d9726c565aabe6ecca5234f847d" Apr 23 17:27:34.617059 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.617027 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c71d25df585fa6f93b1de1e5bcf3e21ea3f00d9726c565aabe6ecca5234f847d"} err="failed to get container status \"c71d25df585fa6f93b1de1e5bcf3e21ea3f00d9726c565aabe6ecca5234f847d\": rpc error: code = NotFound desc = could not find container \"c71d25df585fa6f93b1de1e5bcf3e21ea3f00d9726c565aabe6ecca5234f847d\": container with ID starting with c71d25df585fa6f93b1de1e5bcf3e21ea3f00d9726c565aabe6ecca5234f847d not found: ID does not exist" Apr 23 17:27:34.622563 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.622543 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r"] Apr 23 17:27:34.628036 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.628013 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-c9r2r"] Apr 23 17:27:34.743854 
ip-10-0-135-57 kubenswrapper[2562]: I0423 17:27:34.743824 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8872a186-969e-434a-af09-8dccadcaf56f" path="/var/lib/kubelet/pods/8872a186-969e-434a-af09-8dccadcaf56f/volumes" Apr 23 17:28:04.636319 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:04.636242 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" Apr 23 17:28:07.914884 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:07.914849 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq"] Apr 23 17:28:07.915364 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:07.915156 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" podUID="6fbb239e-faae-4c08-ae51-c15c4253a406" containerName="kserve-container" containerID="cri-o://b33dee054082f929d0d0c2400cfed82d892a6a88e8ac09f822d0d074d55dacab" gracePeriod=30 Apr 23 17:28:07.964478 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:07.964441 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm"] Apr 23 17:28:07.964777 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:07.964762 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8872a186-969e-434a-af09-8dccadcaf56f" containerName="kserve-container" Apr 23 17:28:07.964835 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:07.964780 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8872a186-969e-434a-af09-8dccadcaf56f" containerName="kserve-container" Apr 23 17:28:07.964835 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:07.964791 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8872a186-969e-434a-af09-8dccadcaf56f" containerName="storage-initializer" Apr 23 
17:28:07.964835 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:07.964797 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="8872a186-969e-434a-af09-8dccadcaf56f" containerName="storage-initializer" Apr 23 17:28:07.964941 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:07.964858 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="8872a186-969e-434a-af09-8dccadcaf56f" containerName="kserve-container" Apr 23 17:28:07.967835 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:07.967819 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" Apr 23 17:28:07.976694 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:07.976671 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm"] Apr 23 17:28:08.133187 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:08.133153 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/afd115a5-5572-484c-9b54-fa98d32be17d-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-wfwsm\" (UID: \"afd115a5-5572-484c-9b54-fa98d32be17d\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" Apr 23 17:28:08.234180 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:08.234080 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/afd115a5-5572-484c-9b54-fa98d32be17d-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-wfwsm\" (UID: \"afd115a5-5572-484c-9b54-fa98d32be17d\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" Apr 23 17:28:08.234473 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:08.234453 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/afd115a5-5572-484c-9b54-fa98d32be17d-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-wfwsm\" (UID: \"afd115a5-5572-484c-9b54-fa98d32be17d\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" Apr 23 17:28:08.278150 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:08.278124 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" Apr 23 17:28:08.394980 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:08.394852 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm"] Apr 23 17:28:08.397411 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:28:08.397360 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafd115a5_5572_484c_9b54_fa98d32be17d.slice/crio-5fd0682e0b847acfdcc5af4aa54baffab89e5c0a6443a5870f647eaa4f52b81b WatchSource:0}: Error finding container 5fd0682e0b847acfdcc5af4aa54baffab89e5c0a6443a5870f647eaa4f52b81b: Status 404 returned error can't find the container with id 5fd0682e0b847acfdcc5af4aa54baffab89e5c0a6443a5870f647eaa4f52b81b Apr 23 17:28:08.700894 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:08.700860 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" event={"ID":"afd115a5-5572-484c-9b54-fa98d32be17d","Type":"ContainerStarted","Data":"30bf4039183edd6a44375c17e7958b0a17fe4b6494936b5301e605fbfc58244e"} Apr 23 17:28:08.700894 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:08.700900 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" 
event={"ID":"afd115a5-5572-484c-9b54-fa98d32be17d","Type":"ContainerStarted","Data":"5fd0682e0b847acfdcc5af4aa54baffab89e5c0a6443a5870f647eaa4f52b81b"} Apr 23 17:28:12.714049 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:12.714015 2562 generic.go:358] "Generic (PLEG): container finished" podID="afd115a5-5572-484c-9b54-fa98d32be17d" containerID="30bf4039183edd6a44375c17e7958b0a17fe4b6494936b5301e605fbfc58244e" exitCode=0 Apr 23 17:28:12.714438 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:12.714062 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" event={"ID":"afd115a5-5572-484c-9b54-fa98d32be17d","Type":"ContainerDied","Data":"30bf4039183edd6a44375c17e7958b0a17fe4b6494936b5301e605fbfc58244e"} Apr 23 17:28:13.718513 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:13.718478 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" event={"ID":"afd115a5-5572-484c-9b54-fa98d32be17d","Type":"ContainerStarted","Data":"3539b4e62055ebec77c7b6341ac244a6fc41e2239450c0a3b11d5ea3e2f5e82a"} Apr 23 17:28:13.718980 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:13.718864 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" Apr 23 17:28:13.720187 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:13.720160 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" podUID="afd115a5-5572-484c-9b54-fa98d32be17d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 23 17:28:13.735075 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:13.735027 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" 
podStartSLOduration=6.73499208 podStartE2EDuration="6.73499208s" podCreationTimestamp="2026-04-23 17:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:28:13.734123837 +0000 UTC m=+3173.586938902" watchObservedRunningTime="2026-04-23 17:28:13.73499208 +0000 UTC m=+3173.587807145" Apr 23 17:28:14.602083 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:14.602036 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" podUID="6fbb239e-faae-4c08-ae51-c15c4253a406" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.59:8080/v2/models/xgboost-v2-mlserver/ready\": dial tcp 10.133.0.59:8080: connect: connection refused" Apr 23 17:28:14.721320 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:14.721282 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" podUID="afd115a5-5572-484c-9b54-fa98d32be17d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 23 17:28:16.145213 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:16.145188 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" Apr 23 17:28:16.200489 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:16.200395 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fbb239e-faae-4c08-ae51-c15c4253a406-kserve-provision-location\") pod \"6fbb239e-faae-4c08-ae51-c15c4253a406\" (UID: \"6fbb239e-faae-4c08-ae51-c15c4253a406\") " Apr 23 17:28:16.200760 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:16.200717 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6fbb239e-faae-4c08-ae51-c15c4253a406-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6fbb239e-faae-4c08-ae51-c15c4253a406" (UID: "6fbb239e-faae-4c08-ae51-c15c4253a406"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:28:16.301039 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:16.300996 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6fbb239e-faae-4c08-ae51-c15c4253a406-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:28:16.730009 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:16.729973 2562 generic.go:358] "Generic (PLEG): container finished" podID="6fbb239e-faae-4c08-ae51-c15c4253a406" containerID="b33dee054082f929d0d0c2400cfed82d892a6a88e8ac09f822d0d074d55dacab" exitCode=0 Apr 23 17:28:16.730199 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:16.730032 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" event={"ID":"6fbb239e-faae-4c08-ae51-c15c4253a406","Type":"ContainerDied","Data":"b33dee054082f929d0d0c2400cfed82d892a6a88e8ac09f822d0d074d55dacab"} Apr 23 17:28:16.730199 ip-10-0-135-57 kubenswrapper[2562]: 
I0423 17:28:16.730065 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" event={"ID":"6fbb239e-faae-4c08-ae51-c15c4253a406","Type":"ContainerDied","Data":"882fb3a71b6ae1ace7c6a28f5971b04d22a15d9ac736dd6d488ca458947d4547"} Apr 23 17:28:16.730199 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:16.730084 2562 scope.go:117] "RemoveContainer" containerID="b33dee054082f929d0d0c2400cfed82d892a6a88e8ac09f822d0d074d55dacab" Apr 23 17:28:16.730199 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:16.730107 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq" Apr 23 17:28:16.738309 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:16.738109 2562 scope.go:117] "RemoveContainer" containerID="6da3f506b6e2f0ed138f0f1f2fc2243af31fd6247a719be0b439dcfe94158d45" Apr 23 17:28:16.745686 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:16.745666 2562 scope.go:117] "RemoveContainer" containerID="b33dee054082f929d0d0c2400cfed82d892a6a88e8ac09f822d0d074d55dacab" Apr 23 17:28:16.745984 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:28:16.745970 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b33dee054082f929d0d0c2400cfed82d892a6a88e8ac09f822d0d074d55dacab\": container with ID starting with b33dee054082f929d0d0c2400cfed82d892a6a88e8ac09f822d0d074d55dacab not found: ID does not exist" containerID="b33dee054082f929d0d0c2400cfed82d892a6a88e8ac09f822d0d074d55dacab" Apr 23 17:28:16.746034 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:16.745993 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33dee054082f929d0d0c2400cfed82d892a6a88e8ac09f822d0d074d55dacab"} err="failed to get container status \"b33dee054082f929d0d0c2400cfed82d892a6a88e8ac09f822d0d074d55dacab\": rpc error: code = NotFound desc = could not 
find container \"b33dee054082f929d0d0c2400cfed82d892a6a88e8ac09f822d0d074d55dacab\": container with ID starting with b33dee054082f929d0d0c2400cfed82d892a6a88e8ac09f822d0d074d55dacab not found: ID does not exist" Apr 23 17:28:16.746034 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:16.746011 2562 scope.go:117] "RemoveContainer" containerID="6da3f506b6e2f0ed138f0f1f2fc2243af31fd6247a719be0b439dcfe94158d45" Apr 23 17:28:16.746241 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:28:16.746222 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6da3f506b6e2f0ed138f0f1f2fc2243af31fd6247a719be0b439dcfe94158d45\": container with ID starting with 6da3f506b6e2f0ed138f0f1f2fc2243af31fd6247a719be0b439dcfe94158d45 not found: ID does not exist" containerID="6da3f506b6e2f0ed138f0f1f2fc2243af31fd6247a719be0b439dcfe94158d45" Apr 23 17:28:16.746278 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:16.746249 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6da3f506b6e2f0ed138f0f1f2fc2243af31fd6247a719be0b439dcfe94158d45"} err="failed to get container status \"6da3f506b6e2f0ed138f0f1f2fc2243af31fd6247a719be0b439dcfe94158d45\": rpc error: code = NotFound desc = could not find container \"6da3f506b6e2f0ed138f0f1f2fc2243af31fd6247a719be0b439dcfe94158d45\": container with ID starting with 6da3f506b6e2f0ed138f0f1f2fc2243af31fd6247a719be0b439dcfe94158d45 not found: ID does not exist" Apr 23 17:28:16.752109 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:16.752086 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq"] Apr 23 17:28:16.755240 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:16.755220 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4hphq"] Apr 23 17:28:18.742281 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:18.742243 
2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fbb239e-faae-4c08-ae51-c15c4253a406" path="/var/lib/kubelet/pods/6fbb239e-faae-4c08-ae51-c15c4253a406/volumes" Apr 23 17:28:24.721837 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:24.721783 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" podUID="afd115a5-5572-484c-9b54-fa98d32be17d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 23 17:28:34.722178 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:34.722131 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" podUID="afd115a5-5572-484c-9b54-fa98d32be17d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 23 17:28:44.721867 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:44.721826 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" podUID="afd115a5-5572-484c-9b54-fa98d32be17d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 23 17:28:54.721832 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:28:54.721781 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" podUID="afd115a5-5572-484c-9b54-fa98d32be17d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 23 17:29:04.721719 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:04.721668 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" podUID="afd115a5-5572-484c-9b54-fa98d32be17d" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused" Apr 23 17:29:14.722974 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:14.722942 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" Apr 23 17:29:18.150314 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.150282 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm"] Apr 23 17:29:18.150710 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.150543 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" podUID="afd115a5-5572-484c-9b54-fa98d32be17d" containerName="kserve-container" containerID="cri-o://3539b4e62055ebec77c7b6341ac244a6fc41e2239450c0a3b11d5ea3e2f5e82a" gracePeriod=30 Apr 23 17:29:18.258368 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.258330 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68"] Apr 23 17:29:18.258715 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.258701 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fbb239e-faae-4c08-ae51-c15c4253a406" containerName="storage-initializer" Apr 23 17:29:18.258715 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.258715 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fbb239e-faae-4c08-ae51-c15c4253a406" containerName="storage-initializer" Apr 23 17:29:18.258884 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.258735 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6fbb239e-faae-4c08-ae51-c15c4253a406" containerName="kserve-container" Apr 23 17:29:18.258884 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.258756 2562 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6fbb239e-faae-4c08-ae51-c15c4253a406" containerName="kserve-container" Apr 23 17:29:18.258884 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.258811 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="6fbb239e-faae-4c08-ae51-c15c4253a406" containerName="kserve-container" Apr 23 17:29:18.261940 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.261922 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" Apr 23 17:29:18.276272 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.276244 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68"] Apr 23 17:29:18.323059 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.323030 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66541121-3569-4d4e-8c8f-efe00396b530-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68\" (UID: \"66541121-3569-4d4e-8c8f-efe00396b530\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" Apr 23 17:29:18.424456 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.424353 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66541121-3569-4d4e-8c8f-efe00396b530-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68\" (UID: \"66541121-3569-4d4e-8c8f-efe00396b530\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" Apr 23 17:29:18.424783 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.424732 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/66541121-3569-4d4e-8c8f-efe00396b530-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68\" (UID: \"66541121-3569-4d4e-8c8f-efe00396b530\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" Apr 23 17:29:18.571573 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.571540 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" Apr 23 17:29:18.698363 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.698339 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68"] Apr 23 17:29:18.699947 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:29:18.699920 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66541121_3569_4d4e_8c8f_efe00396b530.slice/crio-bfbd1716067a6bc56ea995d59adac45d5e6e6fb45da4a391a909cd0b4a78fd69 WatchSource:0}: Error finding container bfbd1716067a6bc56ea995d59adac45d5e6e6fb45da4a391a909cd0b4a78fd69: Status 404 returned error can't find the container with id bfbd1716067a6bc56ea995d59adac45d5e6e6fb45da4a391a909cd0b4a78fd69 Apr 23 17:29:18.920276 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.920243 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" event={"ID":"66541121-3569-4d4e-8c8f-efe00396b530","Type":"ContainerStarted","Data":"32b44cab48f3c07f13f7d79cdac057d7a1113e690f21a75351766241c57faa11"} Apr 23 17:29:18.920276 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:18.920280 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" 
event={"ID":"66541121-3569-4d4e-8c8f-efe00396b530","Type":"ContainerStarted","Data":"bfbd1716067a6bc56ea995d59adac45d5e6e6fb45da4a391a909cd0b4a78fd69"} Apr 23 17:29:21.888314 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:21.888292 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" Apr 23 17:29:21.930723 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:21.930687 2562 generic.go:358] "Generic (PLEG): container finished" podID="afd115a5-5572-484c-9b54-fa98d32be17d" containerID="3539b4e62055ebec77c7b6341ac244a6fc41e2239450c0a3b11d5ea3e2f5e82a" exitCode=0 Apr 23 17:29:21.930925 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:21.930754 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" event={"ID":"afd115a5-5572-484c-9b54-fa98d32be17d","Type":"ContainerDied","Data":"3539b4e62055ebec77c7b6341ac244a6fc41e2239450c0a3b11d5ea3e2f5e82a"} Apr 23 17:29:21.930925 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:21.930784 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" Apr 23 17:29:21.930925 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:21.930804 2562 scope.go:117] "RemoveContainer" containerID="3539b4e62055ebec77c7b6341ac244a6fc41e2239450c0a3b11d5ea3e2f5e82a" Apr 23 17:29:21.930925 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:21.930792 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm" event={"ID":"afd115a5-5572-484c-9b54-fa98d32be17d","Type":"ContainerDied","Data":"5fd0682e0b847acfdcc5af4aa54baffab89e5c0a6443a5870f647eaa4f52b81b"} Apr 23 17:29:21.938429 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:21.938407 2562 scope.go:117] "RemoveContainer" containerID="30bf4039183edd6a44375c17e7958b0a17fe4b6494936b5301e605fbfc58244e" Apr 23 17:29:21.945258 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:21.945239 2562 scope.go:117] "RemoveContainer" containerID="3539b4e62055ebec77c7b6341ac244a6fc41e2239450c0a3b11d5ea3e2f5e82a" Apr 23 17:29:21.945505 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:29:21.945487 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3539b4e62055ebec77c7b6341ac244a6fc41e2239450c0a3b11d5ea3e2f5e82a\": container with ID starting with 3539b4e62055ebec77c7b6341ac244a6fc41e2239450c0a3b11d5ea3e2f5e82a not found: ID does not exist" containerID="3539b4e62055ebec77c7b6341ac244a6fc41e2239450c0a3b11d5ea3e2f5e82a" Apr 23 17:29:21.945549 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:21.945520 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3539b4e62055ebec77c7b6341ac244a6fc41e2239450c0a3b11d5ea3e2f5e82a"} err="failed to get container status \"3539b4e62055ebec77c7b6341ac244a6fc41e2239450c0a3b11d5ea3e2f5e82a\": rpc error: code = NotFound desc = could not find container 
\"3539b4e62055ebec77c7b6341ac244a6fc41e2239450c0a3b11d5ea3e2f5e82a\": container with ID starting with 3539b4e62055ebec77c7b6341ac244a6fc41e2239450c0a3b11d5ea3e2f5e82a not found: ID does not exist" Apr 23 17:29:21.945549 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:21.945539 2562 scope.go:117] "RemoveContainer" containerID="30bf4039183edd6a44375c17e7958b0a17fe4b6494936b5301e605fbfc58244e" Apr 23 17:29:21.945784 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:29:21.945767 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30bf4039183edd6a44375c17e7958b0a17fe4b6494936b5301e605fbfc58244e\": container with ID starting with 30bf4039183edd6a44375c17e7958b0a17fe4b6494936b5301e605fbfc58244e not found: ID does not exist" containerID="30bf4039183edd6a44375c17e7958b0a17fe4b6494936b5301e605fbfc58244e" Apr 23 17:29:21.945848 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:21.945793 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30bf4039183edd6a44375c17e7958b0a17fe4b6494936b5301e605fbfc58244e"} err="failed to get container status \"30bf4039183edd6a44375c17e7958b0a17fe4b6494936b5301e605fbfc58244e\": rpc error: code = NotFound desc = could not find container \"30bf4039183edd6a44375c17e7958b0a17fe4b6494936b5301e605fbfc58244e\": container with ID starting with 30bf4039183edd6a44375c17e7958b0a17fe4b6494936b5301e605fbfc58244e not found: ID does not exist" Apr 23 17:29:21.952155 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:21.952135 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/afd115a5-5572-484c-9b54-fa98d32be17d-kserve-provision-location\") pod \"afd115a5-5572-484c-9b54-fa98d32be17d\" (UID: \"afd115a5-5572-484c-9b54-fa98d32be17d\") " Apr 23 17:29:21.952406 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:21.952388 2562 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afd115a5-5572-484c-9b54-fa98d32be17d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "afd115a5-5572-484c-9b54-fa98d32be17d" (UID: "afd115a5-5572-484c-9b54-fa98d32be17d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:29:22.052941 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:22.052856 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/afd115a5-5572-484c-9b54-fa98d32be17d-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:29:22.268904 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:22.268871 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm"] Apr 23 17:29:22.274908 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:22.274882 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-wfwsm"] Apr 23 17:29:22.742189 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:22.742158 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afd115a5-5572-484c-9b54-fa98d32be17d" path="/var/lib/kubelet/pods/afd115a5-5572-484c-9b54-fa98d32be17d/volumes" Apr 23 17:29:22.935378 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:22.935348 2562 generic.go:358] "Generic (PLEG): container finished" podID="66541121-3569-4d4e-8c8f-efe00396b530" containerID="32b44cab48f3c07f13f7d79cdac057d7a1113e690f21a75351766241c57faa11" exitCode=0 Apr 23 17:29:22.935854 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:22.935426 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" 
event={"ID":"66541121-3569-4d4e-8c8f-efe00396b530","Type":"ContainerDied","Data":"32b44cab48f3c07f13f7d79cdac057d7a1113e690f21a75351766241c57faa11"} Apr 23 17:29:23.941547 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:23.941511 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" event={"ID":"66541121-3569-4d4e-8c8f-efe00396b530","Type":"ContainerStarted","Data":"2d151fea7d46a0ef981b92a74e62d0f6153c8c9cf630b615d6633a33f3dcfcfc"} Apr 23 17:29:23.942045 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:23.941719 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" Apr 23 17:29:23.961813 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:23.961762 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" podStartSLOduration=5.961732154 podStartE2EDuration="5.961732154s" podCreationTimestamp="2026-04-23 17:29:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:29:23.960831648 +0000 UTC m=+3243.813646712" watchObservedRunningTime="2026-04-23 17:29:23.961732154 +0000 UTC m=+3243.814547218" Apr 23 17:29:55.036315 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:29:55.036266 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" podUID="66541121-3569-4d4e-8c8f-efe00396b530" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 23 17:30:04.946945 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:04.946909 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" Apr 23 17:30:08.426167 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:30:08.426126 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4"] Apr 23 17:30:08.426652 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:08.426595 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="afd115a5-5572-484c-9b54-fa98d32be17d" containerName="kserve-container" Apr 23 17:30:08.426652 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:08.426613 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd115a5-5572-484c-9b54-fa98d32be17d" containerName="kserve-container" Apr 23 17:30:08.426652 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:08.426628 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="afd115a5-5572-484c-9b54-fa98d32be17d" containerName="storage-initializer" Apr 23 17:30:08.426652 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:08.426637 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd115a5-5572-484c-9b54-fa98d32be17d" containerName="storage-initializer" Apr 23 17:30:08.426914 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:08.426724 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="afd115a5-5572-484c-9b54-fa98d32be17d" containerName="kserve-container" Apr 23 17:30:08.429901 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:08.429882 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" Apr 23 17:30:08.439469 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:08.439437 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4"] Apr 23 17:30:08.452286 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:08.452254 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96499f1a-002a-4f25-9600-fef8def7e241-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-sfmj4\" (UID: \"96499f1a-002a-4f25-9600-fef8def7e241\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" Apr 23 17:30:08.552825 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:08.552785 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96499f1a-002a-4f25-9600-fef8def7e241-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-sfmj4\" (UID: \"96499f1a-002a-4f25-9600-fef8def7e241\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" Apr 23 17:30:08.553280 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:08.553257 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96499f1a-002a-4f25-9600-fef8def7e241-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-sfmj4\" (UID: \"96499f1a-002a-4f25-9600-fef8def7e241\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" Apr 23 17:30:08.554449 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:08.554429 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68"] Apr 23 17:30:08.554693 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:08.554670 
2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" podUID="66541121-3569-4d4e-8c8f-efe00396b530" containerName="kserve-container" containerID="cri-o://2d151fea7d46a0ef981b92a74e62d0f6153c8c9cf630b615d6633a33f3dcfcfc" gracePeriod=30 Apr 23 17:30:08.739948 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:08.739856 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" Apr 23 17:30:08.866543 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:08.866507 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4"] Apr 23 17:30:08.872935 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:30:08.872906 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96499f1a_002a_4f25_9600_fef8def7e241.slice/crio-4c260b4e1dc7eef6902985e0eba2f70488629dd4dafcf32a5eabc1c8e18e1ad3 WatchSource:0}: Error finding container 4c260b4e1dc7eef6902985e0eba2f70488629dd4dafcf32a5eabc1c8e18e1ad3: Status 404 returned error can't find the container with id 4c260b4e1dc7eef6902985e0eba2f70488629dd4dafcf32a5eabc1c8e18e1ad3 Apr 23 17:30:09.083027 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:09.082941 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" event={"ID":"96499f1a-002a-4f25-9600-fef8def7e241","Type":"ContainerStarted","Data":"5af6b750b268e45c8a2747c107e9403c1f0f181f550dc9f187e11b01a93c1e7d"} Apr 23 17:30:09.083027 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:09.082978 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" 
event={"ID":"96499f1a-002a-4f25-9600-fef8def7e241","Type":"ContainerStarted","Data":"4c260b4e1dc7eef6902985e0eba2f70488629dd4dafcf32a5eabc1c8e18e1ad3"} Apr 23 17:30:13.095644 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:13.095608 2562 generic.go:358] "Generic (PLEG): container finished" podID="96499f1a-002a-4f25-9600-fef8def7e241" containerID="5af6b750b268e45c8a2747c107e9403c1f0f181f550dc9f187e11b01a93c1e7d" exitCode=0 Apr 23 17:30:13.096046 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:13.095658 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" event={"ID":"96499f1a-002a-4f25-9600-fef8def7e241","Type":"ContainerDied","Data":"5af6b750b268e45c8a2747c107e9403c1f0f181f550dc9f187e11b01a93c1e7d"} Apr 23 17:30:14.099917 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:14.099878 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" event={"ID":"96499f1a-002a-4f25-9600-fef8def7e241","Type":"ContainerStarted","Data":"23c759187bc8c5e44ff634c8797fe35864065e4113be496baf39daadccefd157"} Apr 23 17:30:14.100301 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:14.100162 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" Apr 23 17:30:14.101451 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:14.101426 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" podUID="96499f1a-002a-4f25-9600-fef8def7e241" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 23 17:30:14.123171 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:14.123119 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" podStartSLOduration=6.123104573 
podStartE2EDuration="6.123104573s" podCreationTimestamp="2026-04-23 17:30:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:30:14.119936912 +0000 UTC m=+3293.972751977" watchObservedRunningTime="2026-04-23 17:30:14.123104573 +0000 UTC m=+3293.975919636" Apr 23 17:30:14.945258 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:14.945213 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" podUID="66541121-3569-4d4e-8c8f-efe00396b530" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.61:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.133.0.61:8080: connect: connection refused" Apr 23 17:30:15.103585 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:15.103544 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" podUID="96499f1a-002a-4f25-9600-fef8def7e241" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 23 17:30:16.291064 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:16.291042 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" Apr 23 17:30:16.311636 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:16.311605 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66541121-3569-4d4e-8c8f-efe00396b530-kserve-provision-location\") pod \"66541121-3569-4d4e-8c8f-efe00396b530\" (UID: \"66541121-3569-4d4e-8c8f-efe00396b530\") " Apr 23 17:30:16.312004 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:16.311963 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66541121-3569-4d4e-8c8f-efe00396b530-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "66541121-3569-4d4e-8c8f-efe00396b530" (UID: "66541121-3569-4d4e-8c8f-efe00396b530"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:30:16.413176 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:16.413085 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66541121-3569-4d4e-8c8f-efe00396b530-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:30:17.110779 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:17.110669 2562 generic.go:358] "Generic (PLEG): container finished" podID="66541121-3569-4d4e-8c8f-efe00396b530" containerID="2d151fea7d46a0ef981b92a74e62d0f6153c8c9cf630b615d6633a33f3dcfcfc" exitCode=0 Apr 23 17:30:17.110779 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:17.110729 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" event={"ID":"66541121-3569-4d4e-8c8f-efe00396b530","Type":"ContainerDied","Data":"2d151fea7d46a0ef981b92a74e62d0f6153c8c9cf630b615d6633a33f3dcfcfc"} Apr 23 17:30:17.110779 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:30:17.110780 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" event={"ID":"66541121-3569-4d4e-8c8f-efe00396b530","Type":"ContainerDied","Data":"bfbd1716067a6bc56ea995d59adac45d5e6e6fb45da4a391a909cd0b4a78fd69"} Apr 23 17:30:17.111027 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:17.110779 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68" Apr 23 17:30:17.111027 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:17.110793 2562 scope.go:117] "RemoveContainer" containerID="2d151fea7d46a0ef981b92a74e62d0f6153c8c9cf630b615d6633a33f3dcfcfc" Apr 23 17:30:17.118713 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:17.118692 2562 scope.go:117] "RemoveContainer" containerID="32b44cab48f3c07f13f7d79cdac057d7a1113e690f21a75351766241c57faa11" Apr 23 17:30:17.125997 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:17.125979 2562 scope.go:117] "RemoveContainer" containerID="2d151fea7d46a0ef981b92a74e62d0f6153c8c9cf630b615d6633a33f3dcfcfc" Apr 23 17:30:17.126268 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:30:17.126248 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d151fea7d46a0ef981b92a74e62d0f6153c8c9cf630b615d6633a33f3dcfcfc\": container with ID starting with 2d151fea7d46a0ef981b92a74e62d0f6153c8c9cf630b615d6633a33f3dcfcfc not found: ID does not exist" containerID="2d151fea7d46a0ef981b92a74e62d0f6153c8c9cf630b615d6633a33f3dcfcfc" Apr 23 17:30:17.126341 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:17.126277 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d151fea7d46a0ef981b92a74e62d0f6153c8c9cf630b615d6633a33f3dcfcfc"} err="failed to get container status \"2d151fea7d46a0ef981b92a74e62d0f6153c8c9cf630b615d6633a33f3dcfcfc\": rpc error: 
code = NotFound desc = could not find container \"2d151fea7d46a0ef981b92a74e62d0f6153c8c9cf630b615d6633a33f3dcfcfc\": container with ID starting with 2d151fea7d46a0ef981b92a74e62d0f6153c8c9cf630b615d6633a33f3dcfcfc not found: ID does not exist" Apr 23 17:30:17.126341 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:17.126295 2562 scope.go:117] "RemoveContainer" containerID="32b44cab48f3c07f13f7d79cdac057d7a1113e690f21a75351766241c57faa11" Apr 23 17:30:17.126576 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:30:17.126559 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32b44cab48f3c07f13f7d79cdac057d7a1113e690f21a75351766241c57faa11\": container with ID starting with 32b44cab48f3c07f13f7d79cdac057d7a1113e690f21a75351766241c57faa11 not found: ID does not exist" containerID="32b44cab48f3c07f13f7d79cdac057d7a1113e690f21a75351766241c57faa11" Apr 23 17:30:17.126617 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:17.126584 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32b44cab48f3c07f13f7d79cdac057d7a1113e690f21a75351766241c57faa11"} err="failed to get container status \"32b44cab48f3c07f13f7d79cdac057d7a1113e690f21a75351766241c57faa11\": rpc error: code = NotFound desc = could not find container \"32b44cab48f3c07f13f7d79cdac057d7a1113e690f21a75351766241c57faa11\": container with ID starting with 32b44cab48f3c07f13f7d79cdac057d7a1113e690f21a75351766241c57faa11 not found: ID does not exist" Apr 23 17:30:17.142891 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:17.142858 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68"] Apr 23 17:30:17.149166 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:17.149141 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-8jv68"] Apr 23 17:30:18.741476 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:30:18.741442 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66541121-3569-4d4e-8c8f-efe00396b530" path="/var/lib/kubelet/pods/66541121-3569-4d4e-8c8f-efe00396b530/volumes" Apr 23 17:30:25.104350 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:25.104305 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" podUID="96499f1a-002a-4f25-9600-fef8def7e241" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 23 17:30:35.104006 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:35.103962 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" podUID="96499f1a-002a-4f25-9600-fef8def7e241" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 23 17:30:45.103613 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:45.103521 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" podUID="96499f1a-002a-4f25-9600-fef8def7e241" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 23 17:30:55.103869 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:30:55.103823 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" podUID="96499f1a-002a-4f25-9600-fef8def7e241" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 23 17:31:05.103719 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:05.103674 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" podUID="96499f1a-002a-4f25-9600-fef8def7e241" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 23 17:31:15.104933 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:15.104899 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" Apr 23 17:31:18.563999 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:18.563963 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m"] Apr 23 17:31:18.564443 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:18.564426 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66541121-3569-4d4e-8c8f-efe00396b530" containerName="kserve-container" Apr 23 17:31:18.564487 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:18.564446 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="66541121-3569-4d4e-8c8f-efe00396b530" containerName="kserve-container" Apr 23 17:31:18.564487 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:18.564461 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66541121-3569-4d4e-8c8f-efe00396b530" containerName="storage-initializer" Apr 23 17:31:18.564487 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:18.564470 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="66541121-3569-4d4e-8c8f-efe00396b530" containerName="storage-initializer" Apr 23 17:31:18.564585 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:18.564538 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="66541121-3569-4d4e-8c8f-efe00396b530" containerName="kserve-container" Apr 23 17:31:18.567973 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:18.567956 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4"] Apr 23 17:31:18.568113 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:18.568091 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" Apr 23 17:31:18.568237 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:18.568215 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" podUID="96499f1a-002a-4f25-9600-fef8def7e241" containerName="kserve-container" containerID="cri-o://23c759187bc8c5e44ff634c8797fe35864065e4113be496baf39daadccefd157" gracePeriod=30 Apr 23 17:31:18.571228 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:18.571211 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 23 17:31:18.584586 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:18.584563 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m"] Apr 23 17:31:18.622851 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:18.622821 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9e27568-220b-44de-82bf-f8df701829fa-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-7545595b5-7952m\" (UID: \"e9e27568-220b-44de-82bf-f8df701829fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" Apr 23 17:31:18.723726 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:18.723693 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9e27568-220b-44de-82bf-f8df701829fa-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-7545595b5-7952m\" (UID: \"e9e27568-220b-44de-82bf-f8df701829fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" Apr 23 17:31:18.724088 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:18.724070 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9e27568-220b-44de-82bf-f8df701829fa-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-7545595b5-7952m\" (UID: \"e9e27568-220b-44de-82bf-f8df701829fa\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" Apr 23 17:31:18.878349 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:18.878258 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" Apr 23 17:31:19.007890 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:19.007855 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m"] Apr 23 17:31:19.010932 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:31:19.010902 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9e27568_220b_44de_82bf_f8df701829fa.slice/crio-b2a6480a0c0d1bc16b364f803822f0445c8d6e2961aa11b3e01f70079a8d1850 WatchSource:0}: Error finding container b2a6480a0c0d1bc16b364f803822f0445c8d6e2961aa11b3e01f70079a8d1850: Status 404 returned error can't find the container with id b2a6480a0c0d1bc16b364f803822f0445c8d6e2961aa11b3e01f70079a8d1850 Apr 23 17:31:19.303696 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:19.303659 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" event={"ID":"e9e27568-220b-44de-82bf-f8df701829fa","Type":"ContainerStarted","Data":"fcf6e3a896acbe14207a873f059af5a53e697962dafe05af920f7be1cf7f939c"} Apr 23 17:31:19.303696 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:19.303699 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" 
event={"ID":"e9e27568-220b-44de-82bf-f8df701829fa","Type":"ContainerStarted","Data":"b2a6480a0c0d1bc16b364f803822f0445c8d6e2961aa11b3e01f70079a8d1850"} Apr 23 17:31:20.307584 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:20.307494 2562 generic.go:358] "Generic (PLEG): container finished" podID="e9e27568-220b-44de-82bf-f8df701829fa" containerID="fcf6e3a896acbe14207a873f059af5a53e697962dafe05af920f7be1cf7f939c" exitCode=0 Apr 23 17:31:20.307584 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:20.307554 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" event={"ID":"e9e27568-220b-44de-82bf-f8df701829fa","Type":"ContainerDied","Data":"fcf6e3a896acbe14207a873f059af5a53e697962dafe05af920f7be1cf7f939c"} Apr 23 17:31:21.312203 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:21.312166 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" event={"ID":"e9e27568-220b-44de-82bf-f8df701829fa","Type":"ContainerStarted","Data":"68a81575d67aafa936e271c4b30328d9fea20037210709d1fe38d1230fe1a22e"} Apr 23 17:31:21.312642 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:21.312361 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" Apr 23 17:31:21.313751 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:21.313715 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" podUID="e9e27568-220b-44de-82bf-f8df701829fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 23 17:31:21.338193 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:21.338132 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" podStartSLOduration=3.338111813 
podStartE2EDuration="3.338111813s" podCreationTimestamp="2026-04-23 17:31:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:31:21.336047842 +0000 UTC m=+3361.188862908" watchObservedRunningTime="2026-04-23 17:31:21.338111813 +0000 UTC m=+3361.190926877" Apr 23 17:31:22.317153 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:22.317118 2562 generic.go:358] "Generic (PLEG): container finished" podID="96499f1a-002a-4f25-9600-fef8def7e241" containerID="23c759187bc8c5e44ff634c8797fe35864065e4113be496baf39daadccefd157" exitCode=0 Apr 23 17:31:22.317533 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:22.317194 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" event={"ID":"96499f1a-002a-4f25-9600-fef8def7e241","Type":"ContainerDied","Data":"23c759187bc8c5e44ff634c8797fe35864065e4113be496baf39daadccefd157"} Apr 23 17:31:22.317697 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:22.317674 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" podUID="e9e27568-220b-44de-82bf-f8df701829fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 23 17:31:22.409287 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:22.409264 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" Apr 23 17:31:22.455348 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:22.455313 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96499f1a-002a-4f25-9600-fef8def7e241-kserve-provision-location\") pod \"96499f1a-002a-4f25-9600-fef8def7e241\" (UID: \"96499f1a-002a-4f25-9600-fef8def7e241\") " Apr 23 17:31:22.455685 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:22.455664 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96499f1a-002a-4f25-9600-fef8def7e241-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "96499f1a-002a-4f25-9600-fef8def7e241" (UID: "96499f1a-002a-4f25-9600-fef8def7e241"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:31:22.556058 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:22.555962 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/96499f1a-002a-4f25-9600-fef8def7e241-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:31:23.321847 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:23.321810 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" event={"ID":"96499f1a-002a-4f25-9600-fef8def7e241","Type":"ContainerDied","Data":"4c260b4e1dc7eef6902985e0eba2f70488629dd4dafcf32a5eabc1c8e18e1ad3"} Apr 23 17:31:23.321847 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:23.321852 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4" Apr 23 17:31:23.322322 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:23.321866 2562 scope.go:117] "RemoveContainer" containerID="23c759187bc8c5e44ff634c8797fe35864065e4113be496baf39daadccefd157" Apr 23 17:31:23.329754 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:23.329723 2562 scope.go:117] "RemoveContainer" containerID="5af6b750b268e45c8a2747c107e9403c1f0f181f550dc9f187e11b01a93c1e7d" Apr 23 17:31:23.344530 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:23.344498 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4"] Apr 23 17:31:23.349581 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:23.349555 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-sfmj4"] Apr 23 17:31:24.741534 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:24.741498 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96499f1a-002a-4f25-9600-fef8def7e241" path="/var/lib/kubelet/pods/96499f1a-002a-4f25-9600-fef8def7e241/volumes" Apr 23 17:31:32.318029 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:32.317985 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" podUID="e9e27568-220b-44de-82bf-f8df701829fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 23 17:31:42.317976 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:42.317934 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" podUID="e9e27568-220b-44de-82bf-f8df701829fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 23 17:31:52.318491 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:31:52.318441 2562 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" podUID="e9e27568-220b-44de-82bf-f8df701829fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 23 17:32:02.317705 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:02.317660 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" podUID="e9e27568-220b-44de-82bf-f8df701829fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 23 17:32:12.318681 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:12.318630 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" podUID="e9e27568-220b-44de-82bf-f8df701829fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 23 17:32:22.318052 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:22.318003 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" podUID="e9e27568-220b-44de-82bf-f8df701829fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 23 17:32:32.319689 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:32.319653 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" Apr 23 17:32:38.650279 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:38.650242 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m"] Apr 23 17:32:38.650803 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:38.650607 2562 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" podUID="e9e27568-220b-44de-82bf-f8df701829fa" containerName="kserve-container" containerID="cri-o://68a81575d67aafa936e271c4b30328d9fea20037210709d1fe38d1230fe1a22e" gracePeriod=30 Apr 23 17:32:38.782413 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:38.782380 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t"] Apr 23 17:32:38.782703 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:38.782691 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96499f1a-002a-4f25-9600-fef8def7e241" containerName="kserve-container" Apr 23 17:32:38.782781 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:38.782705 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="96499f1a-002a-4f25-9600-fef8def7e241" containerName="kserve-container" Apr 23 17:32:38.782781 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:38.782716 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="96499f1a-002a-4f25-9600-fef8def7e241" containerName="storage-initializer" Apr 23 17:32:38.782781 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:38.782721 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="96499f1a-002a-4f25-9600-fef8def7e241" containerName="storage-initializer" Apr 23 17:32:38.782888 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:38.782812 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="96499f1a-002a-4f25-9600-fef8def7e241" containerName="kserve-container" Apr 23 17:32:38.785709 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:38.785693 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" Apr 23 17:32:38.788506 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:38.788481 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 17:32:38.795161 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:38.795135 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t"] Apr 23 17:32:38.916147 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:38.916049 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98313a47-3572-4852-a557-d816e450fcef-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t\" (UID: \"98313a47-3572-4852-a557-d816e450fcef\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" Apr 23 17:32:38.916147 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:38.916105 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/98313a47-3572-4852-a557-d816e450fcef-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t\" (UID: \"98313a47-3572-4852-a557-d816e450fcef\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" Apr 23 17:32:39.016909 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:39.016871 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/98313a47-3572-4852-a557-d816e450fcef-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t\" (UID: \"98313a47-3572-4852-a557-d816e450fcef\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" Apr 23 17:32:39.017091 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:39.016956 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98313a47-3572-4852-a557-d816e450fcef-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t\" (UID: \"98313a47-3572-4852-a557-d816e450fcef\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" Apr 23 17:32:39.017286 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:39.017269 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98313a47-3572-4852-a557-d816e450fcef-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t\" (UID: \"98313a47-3572-4852-a557-d816e450fcef\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" Apr 23 17:32:39.017526 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:39.017505 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/98313a47-3572-4852-a557-d816e450fcef-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t\" (UID: \"98313a47-3572-4852-a557-d816e450fcef\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" Apr 23 17:32:39.096521 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:39.096483 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" Apr 23 17:32:39.228391 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:39.228367 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t"] Apr 23 17:32:39.230263 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:32:39.230228 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98313a47_3572_4852_a557_d816e450fcef.slice/crio-40abf1d1e95c71dfc98bf59d7d4d1180558e368e17d2d629c79b1958d5993adc WatchSource:0}: Error finding container 40abf1d1e95c71dfc98bf59d7d4d1180558e368e17d2d629c79b1958d5993adc: Status 404 returned error can't find the container with id 40abf1d1e95c71dfc98bf59d7d4d1180558e368e17d2d629c79b1958d5993adc Apr 23 17:32:39.232180 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:39.232160 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 17:32:39.552502 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:39.552405 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" event={"ID":"98313a47-3572-4852-a557-d816e450fcef","Type":"ContainerStarted","Data":"2de1d99d001def4fba4dc506872a7a29f2402ff06b0bc84b6b1c32a9a1a22ba5"} Apr 23 17:32:39.552502 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:39.552445 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" event={"ID":"98313a47-3572-4852-a557-d816e450fcef","Type":"ContainerStarted","Data":"40abf1d1e95c71dfc98bf59d7d4d1180558e368e17d2d629c79b1958d5993adc"} Apr 23 17:32:40.556951 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:40.556871 2562 generic.go:358] "Generic (PLEG): container finished" podID="98313a47-3572-4852-a557-d816e450fcef" 
containerID="2de1d99d001def4fba4dc506872a7a29f2402ff06b0bc84b6b1c32a9a1a22ba5" exitCode=0 Apr 23 17:32:40.556951 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:40.556918 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" event={"ID":"98313a47-3572-4852-a557-d816e450fcef","Type":"ContainerDied","Data":"2de1d99d001def4fba4dc506872a7a29f2402ff06b0bc84b6b1c32a9a1a22ba5"} Apr 23 17:32:41.563831 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:41.563784 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" event={"ID":"98313a47-3572-4852-a557-d816e450fcef","Type":"ContainerStarted","Data":"6cca363d65e971661ccd3a0efdc8218486751f1a46971869fda7fecf8df79a95"} Apr 23 17:32:41.564253 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:41.564024 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" Apr 23 17:32:41.565346 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:41.565317 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" podUID="98313a47-3572-4852-a557-d816e450fcef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 23 17:32:41.584692 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:41.584644 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" podStartSLOduration=3.584630378 podStartE2EDuration="3.584630378s" podCreationTimestamp="2026-04-23 17:32:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:32:41.582915144 +0000 UTC 
m=+3441.435730245" watchObservedRunningTime="2026-04-23 17:32:41.584630378 +0000 UTC m=+3441.437445868" Apr 23 17:32:42.317896 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:42.317852 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" podUID="e9e27568-220b-44de-82bf-f8df701829fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 23 17:32:42.568485 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:42.568403 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" podUID="98313a47-3572-4852-a557-d816e450fcef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 23 17:32:43.186966 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.186943 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" Apr 23 17:32:43.355890 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.355801 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9e27568-220b-44de-82bf-f8df701829fa-kserve-provision-location\") pod \"e9e27568-220b-44de-82bf-f8df701829fa\" (UID: \"e9e27568-220b-44de-82bf-f8df701829fa\") " Apr 23 17:32:43.356165 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.356144 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9e27568-220b-44de-82bf-f8df701829fa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e9e27568-220b-44de-82bf-f8df701829fa" (UID: "e9e27568-220b-44de-82bf-f8df701829fa"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:32:43.457337 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.457301 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e9e27568-220b-44de-82bf-f8df701829fa-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:32:43.572760 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.572707 2562 generic.go:358] "Generic (PLEG): container finished" podID="e9e27568-220b-44de-82bf-f8df701829fa" containerID="68a81575d67aafa936e271c4b30328d9fea20037210709d1fe38d1230fe1a22e" exitCode=0 Apr 23 17:32:43.573148 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.572797 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" Apr 23 17:32:43.573148 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.572797 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" event={"ID":"e9e27568-220b-44de-82bf-f8df701829fa","Type":"ContainerDied","Data":"68a81575d67aafa936e271c4b30328d9fea20037210709d1fe38d1230fe1a22e"} Apr 23 17:32:43.573148 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.572838 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m" event={"ID":"e9e27568-220b-44de-82bf-f8df701829fa","Type":"ContainerDied","Data":"b2a6480a0c0d1bc16b364f803822f0445c8d6e2961aa11b3e01f70079a8d1850"} Apr 23 17:32:43.573148 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.572854 2562 scope.go:117] "RemoveContainer" containerID="68a81575d67aafa936e271c4b30328d9fea20037210709d1fe38d1230fe1a22e" Apr 23 17:32:43.580872 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.580828 2562 scope.go:117] "RemoveContainer" containerID="fcf6e3a896acbe14207a873f059af5a53e697962dafe05af920f7be1cf7f939c" Apr 23 
17:32:43.588314 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.588298 2562 scope.go:117] "RemoveContainer" containerID="68a81575d67aafa936e271c4b30328d9fea20037210709d1fe38d1230fe1a22e" Apr 23 17:32:43.588585 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:32:43.588565 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68a81575d67aafa936e271c4b30328d9fea20037210709d1fe38d1230fe1a22e\": container with ID starting with 68a81575d67aafa936e271c4b30328d9fea20037210709d1fe38d1230fe1a22e not found: ID does not exist" containerID="68a81575d67aafa936e271c4b30328d9fea20037210709d1fe38d1230fe1a22e" Apr 23 17:32:43.588648 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.588594 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68a81575d67aafa936e271c4b30328d9fea20037210709d1fe38d1230fe1a22e"} err="failed to get container status \"68a81575d67aafa936e271c4b30328d9fea20037210709d1fe38d1230fe1a22e\": rpc error: code = NotFound desc = could not find container \"68a81575d67aafa936e271c4b30328d9fea20037210709d1fe38d1230fe1a22e\": container with ID starting with 68a81575d67aafa936e271c4b30328d9fea20037210709d1fe38d1230fe1a22e not found: ID does not exist" Apr 23 17:32:43.588648 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.588613 2562 scope.go:117] "RemoveContainer" containerID="fcf6e3a896acbe14207a873f059af5a53e697962dafe05af920f7be1cf7f939c" Apr 23 17:32:43.588855 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:32:43.588838 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcf6e3a896acbe14207a873f059af5a53e697962dafe05af920f7be1cf7f939c\": container with ID starting with fcf6e3a896acbe14207a873f059af5a53e697962dafe05af920f7be1cf7f939c not found: ID does not exist" containerID="fcf6e3a896acbe14207a873f059af5a53e697962dafe05af920f7be1cf7f939c" Apr 23 17:32:43.588904 
ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.588861 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcf6e3a896acbe14207a873f059af5a53e697962dafe05af920f7be1cf7f939c"} err="failed to get container status \"fcf6e3a896acbe14207a873f059af5a53e697962dafe05af920f7be1cf7f939c\": rpc error: code = NotFound desc = could not find container \"fcf6e3a896acbe14207a873f059af5a53e697962dafe05af920f7be1cf7f939c\": container with ID starting with fcf6e3a896acbe14207a873f059af5a53e697962dafe05af920f7be1cf7f939c not found: ID does not exist" Apr 23 17:32:43.595788 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.595732 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m"] Apr 23 17:32:43.605313 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:43.605290 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-7545595b5-7952m"] Apr 23 17:32:44.742990 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:44.742959 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9e27568-220b-44de-82bf-f8df701829fa" path="/var/lib/kubelet/pods/e9e27568-220b-44de-82bf-f8df701829fa/volumes" Apr 23 17:32:52.568837 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:32:52.568790 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" podUID="98313a47-3572-4852-a557-d816e450fcef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 23 17:33:02.568725 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:02.568681 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" podUID="98313a47-3572-4852-a557-d816e450fcef" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.64:8080: connect: connection refused" Apr 23 17:33:12.568547 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:12.568502 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" podUID="98313a47-3572-4852-a557-d816e450fcef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 23 17:33:22.568922 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:22.568876 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" podUID="98313a47-3572-4852-a557-d816e450fcef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 23 17:33:32.568709 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:32.568664 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" podUID="98313a47-3572-4852-a557-d816e450fcef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 23 17:33:42.568909 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:42.568815 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" podUID="98313a47-3572-4852-a557-d816e450fcef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 23 17:33:52.569623 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:52.569591 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" Apr 23 17:33:58.839714 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:58.839682 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t"] Apr 23 17:33:58.840280 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:58.840004 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" podUID="98313a47-3572-4852-a557-d816e450fcef" containerName="kserve-container" containerID="cri-o://6cca363d65e971661ccd3a0efdc8218486751f1a46971869fda7fecf8df79a95" gracePeriod=30 Apr 23 17:33:59.947713 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:59.947678 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc"] Apr 23 17:33:59.948183 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:59.948010 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9e27568-220b-44de-82bf-f8df701829fa" containerName="kserve-container" Apr 23 17:33:59.948183 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:59.948022 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e27568-220b-44de-82bf-f8df701829fa" containerName="kserve-container" Apr 23 17:33:59.948183 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:59.948030 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e9e27568-220b-44de-82bf-f8df701829fa" containerName="storage-initializer" Apr 23 17:33:59.948183 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:59.948035 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e27568-220b-44de-82bf-f8df701829fa" containerName="storage-initializer" Apr 23 17:33:59.948183 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:59.948101 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="e9e27568-220b-44de-82bf-f8df701829fa" containerName="kserve-container" Apr 23 17:33:59.951433 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:59.951411 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc" Apr 23 17:33:59.960941 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:59.960916 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc"] Apr 23 17:33:59.979056 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:33:59.979024 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/524ff90d-edcf-4a72-9856-0101b0ec89cf-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc\" (UID: \"524ff90d-edcf-4a72-9856-0101b0ec89cf\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc" Apr 23 17:34:00.080293 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:00.080252 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/524ff90d-edcf-4a72-9856-0101b0ec89cf-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc\" (UID: \"524ff90d-edcf-4a72-9856-0101b0ec89cf\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc" Apr 23 17:34:00.080628 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:00.080609 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/524ff90d-edcf-4a72-9856-0101b0ec89cf-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc\" (UID: \"524ff90d-edcf-4a72-9856-0101b0ec89cf\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc" Apr 23 17:34:00.262947 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:00.262855 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc" Apr 23 17:34:00.389422 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:00.389395 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc"] Apr 23 17:34:00.392106 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:34:00.392079 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod524ff90d_edcf_4a72_9856_0101b0ec89cf.slice/crio-3313366be59f7b66bb01b32b86579d49db4661ddfefe256a6d3b6e926d9374a1 WatchSource:0}: Error finding container 3313366be59f7b66bb01b32b86579d49db4661ddfefe256a6d3b6e926d9374a1: Status 404 returned error can't find the container with id 3313366be59f7b66bb01b32b86579d49db4661ddfefe256a6d3b6e926d9374a1 Apr 23 17:34:00.816277 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:00.816232 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc" event={"ID":"524ff90d-edcf-4a72-9856-0101b0ec89cf","Type":"ContainerStarted","Data":"43f6b8b49572211bba24278ee62bc236e5f7164d3b5c0556444c5da8af2b388d"} Apr 23 17:34:00.816277 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:00.816279 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc" event={"ID":"524ff90d-edcf-4a72-9856-0101b0ec89cf","Type":"ContainerStarted","Data":"3313366be59f7b66bb01b32b86579d49db4661ddfefe256a6d3b6e926d9374a1"} Apr 23 17:34:02.568625 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:02.568573 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" podUID="98313a47-3572-4852-a557-d816e450fcef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: 
connection refused" Apr 23 17:34:03.289757 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.289714 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" Apr 23 17:34:03.306458 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.306425 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/98313a47-3572-4852-a557-d816e450fcef-cabundle-cert\") pod \"98313a47-3572-4852-a557-d816e450fcef\" (UID: \"98313a47-3572-4852-a557-d816e450fcef\") " Apr 23 17:34:03.306691 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.306490 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98313a47-3572-4852-a557-d816e450fcef-kserve-provision-location\") pod \"98313a47-3572-4852-a557-d816e450fcef\" (UID: \"98313a47-3572-4852-a557-d816e450fcef\") " Apr 23 17:34:03.307056 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.307022 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98313a47-3572-4852-a557-d816e450fcef-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "98313a47-3572-4852-a557-d816e450fcef" (UID: "98313a47-3572-4852-a557-d816e450fcef"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:34:03.307157 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.307078 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98313a47-3572-4852-a557-d816e450fcef-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "98313a47-3572-4852-a557-d816e450fcef" (UID: "98313a47-3572-4852-a557-d816e450fcef"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:34:03.407373 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.407330 2562 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/98313a47-3572-4852-a557-d816e450fcef-cabundle-cert\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:34:03.407373 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.407370 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98313a47-3572-4852-a557-d816e450fcef-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:34:03.826762 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.826655 2562 generic.go:358] "Generic (PLEG): container finished" podID="98313a47-3572-4852-a557-d816e450fcef" containerID="6cca363d65e971661ccd3a0efdc8218486751f1a46971869fda7fecf8df79a95" exitCode=0 Apr 23 17:34:03.826762 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.826718 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" Apr 23 17:34:03.826762 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.826720 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" event={"ID":"98313a47-3572-4852-a557-d816e450fcef","Type":"ContainerDied","Data":"6cca363d65e971661ccd3a0efdc8218486751f1a46971869fda7fecf8df79a95"} Apr 23 17:34:03.827317 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.826782 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t" event={"ID":"98313a47-3572-4852-a557-d816e450fcef","Type":"ContainerDied","Data":"40abf1d1e95c71dfc98bf59d7d4d1180558e368e17d2d629c79b1958d5993adc"} Apr 23 17:34:03.827317 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.826802 2562 scope.go:117] "RemoveContainer" containerID="6cca363d65e971661ccd3a0efdc8218486751f1a46971869fda7fecf8df79a95" Apr 23 17:34:03.834842 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.834825 2562 scope.go:117] "RemoveContainer" containerID="2de1d99d001def4fba4dc506872a7a29f2402ff06b0bc84b6b1c32a9a1a22ba5" Apr 23 17:34:03.841850 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.841829 2562 scope.go:117] "RemoveContainer" containerID="6cca363d65e971661ccd3a0efdc8218486751f1a46971869fda7fecf8df79a95" Apr 23 17:34:03.842166 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:34:03.842144 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cca363d65e971661ccd3a0efdc8218486751f1a46971869fda7fecf8df79a95\": container with ID starting with 6cca363d65e971661ccd3a0efdc8218486751f1a46971869fda7fecf8df79a95 not found: ID does not exist" containerID="6cca363d65e971661ccd3a0efdc8218486751f1a46971869fda7fecf8df79a95" Apr 23 17:34:03.842218 ip-10-0-135-57 kubenswrapper[2562]: I0423 
17:34:03.842176 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cca363d65e971661ccd3a0efdc8218486751f1a46971869fda7fecf8df79a95"} err="failed to get container status \"6cca363d65e971661ccd3a0efdc8218486751f1a46971869fda7fecf8df79a95\": rpc error: code = NotFound desc = could not find container \"6cca363d65e971661ccd3a0efdc8218486751f1a46971869fda7fecf8df79a95\": container with ID starting with 6cca363d65e971661ccd3a0efdc8218486751f1a46971869fda7fecf8df79a95 not found: ID does not exist" Apr 23 17:34:03.842218 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.842195 2562 scope.go:117] "RemoveContainer" containerID="2de1d99d001def4fba4dc506872a7a29f2402ff06b0bc84b6b1c32a9a1a22ba5" Apr 23 17:34:03.842418 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:34:03.842400 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2de1d99d001def4fba4dc506872a7a29f2402ff06b0bc84b6b1c32a9a1a22ba5\": container with ID starting with 2de1d99d001def4fba4dc506872a7a29f2402ff06b0bc84b6b1c32a9a1a22ba5 not found: ID does not exist" containerID="2de1d99d001def4fba4dc506872a7a29f2402ff06b0bc84b6b1c32a9a1a22ba5" Apr 23 17:34:03.842472 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.842427 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de1d99d001def4fba4dc506872a7a29f2402ff06b0bc84b6b1c32a9a1a22ba5"} err="failed to get container status \"2de1d99d001def4fba4dc506872a7a29f2402ff06b0bc84b6b1c32a9a1a22ba5\": rpc error: code = NotFound desc = could not find container \"2de1d99d001def4fba4dc506872a7a29f2402ff06b0bc84b6b1c32a9a1a22ba5\": container with ID starting with 2de1d99d001def4fba4dc506872a7a29f2402ff06b0bc84b6b1c32a9a1a22ba5 not found: ID does not exist" Apr 23 17:34:03.848017 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.847995 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t"] Apr 23 17:34:03.850075 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:03.850055 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5b8c65f485-qk82t"] Apr 23 17:34:04.741988 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:04.741956 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98313a47-3572-4852-a557-d816e450fcef" path="/var/lib/kubelet/pods/98313a47-3572-4852-a557-d816e450fcef/volumes" Apr 23 17:34:06.103649 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:34:06.103619 2562 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod524ff90d_edcf_4a72_9856_0101b0ec89cf.slice/crio-conmon-43f6b8b49572211bba24278ee62bc236e5f7164d3b5c0556444c5da8af2b388d.scope\": RecentStats: unable to find data in memory cache]" Apr 23 17:34:06.837365 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:06.837339 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc_524ff90d-edcf-4a72-9856-0101b0ec89cf/storage-initializer/0.log" Apr 23 17:34:06.837534 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:06.837378 2562 generic.go:358] "Generic (PLEG): container finished" podID="524ff90d-edcf-4a72-9856-0101b0ec89cf" containerID="43f6b8b49572211bba24278ee62bc236e5f7164d3b5c0556444c5da8af2b388d" exitCode=1 Apr 23 17:34:06.837534 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:06.837460 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc" event={"ID":"524ff90d-edcf-4a72-9856-0101b0ec89cf","Type":"ContainerDied","Data":"43f6b8b49572211bba24278ee62bc236e5f7164d3b5c0556444c5da8af2b388d"} Apr 23 17:34:07.842076 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:34:07.842051 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc_524ff90d-edcf-4a72-9856-0101b0ec89cf/storage-initializer/0.log" Apr 23 17:34:07.842517 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:07.842172 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc" event={"ID":"524ff90d-edcf-4a72-9856-0101b0ec89cf","Type":"ContainerStarted","Data":"674ac7b06c3c0ee4617ec597970fc8cc32f932b5a522a549d39fa4bfbbea612f"} Apr 23 17:34:09.931958 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:09.931920 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc"] Apr 23 17:34:09.932376 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:09.932146 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc" podUID="524ff90d-edcf-4a72-9856-0101b0ec89cf" containerName="storage-initializer" containerID="cri-o://674ac7b06c3c0ee4617ec597970fc8cc32f932b5a522a549d39fa4bfbbea612f" gracePeriod=30 Apr 23 17:34:11.057677 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.057631 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr"] Apr 23 17:34:11.058146 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.057961 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98313a47-3572-4852-a557-d816e450fcef" containerName="storage-initializer" Apr 23 17:34:11.058146 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.057972 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="98313a47-3572-4852-a557-d816e450fcef" containerName="storage-initializer" Apr 23 17:34:11.058146 ip-10-0-135-57 kubenswrapper[2562]: 
I0423 17:34:11.057979 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98313a47-3572-4852-a557-d816e450fcef" containerName="kserve-container" Apr 23 17:34:11.058146 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.057985 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="98313a47-3572-4852-a557-d816e450fcef" containerName="kserve-container" Apr 23 17:34:11.058146 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.058034 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="98313a47-3572-4852-a557-d816e450fcef" containerName="kserve-container" Apr 23 17:34:11.060809 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.060792 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" Apr 23 17:34:11.063607 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.063584 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 17:34:11.069888 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.069864 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr"] Apr 23 17:34:11.174356 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.174302 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b43d81f5-d692-4d9c-93c6-17b8f34f92f3-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr\" (UID: \"b43d81f5-d692-4d9c-93c6-17b8f34f92f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" Apr 23 17:34:11.174543 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.174373 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/b43d81f5-d692-4d9c-93c6-17b8f34f92f3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr\" (UID: \"b43d81f5-d692-4d9c-93c6-17b8f34f92f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" Apr 23 17:34:11.275606 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.275564 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b43d81f5-d692-4d9c-93c6-17b8f34f92f3-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr\" (UID: \"b43d81f5-d692-4d9c-93c6-17b8f34f92f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" Apr 23 17:34:11.275813 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.275618 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b43d81f5-d692-4d9c-93c6-17b8f34f92f3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr\" (UID: \"b43d81f5-d692-4d9c-93c6-17b8f34f92f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" Apr 23 17:34:11.275982 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.275962 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b43d81f5-d692-4d9c-93c6-17b8f34f92f3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr\" (UID: \"b43d81f5-d692-4d9c-93c6-17b8f34f92f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" Apr 23 17:34:11.276280 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.276254 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/b43d81f5-d692-4d9c-93c6-17b8f34f92f3-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr\" (UID: \"b43d81f5-d692-4d9c-93c6-17b8f34f92f3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" Apr 23 17:34:11.371795 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.371704 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" Apr 23 17:34:11.500761 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.500548 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr"] Apr 23 17:34:11.503616 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:34:11.503586 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb43d81f5_d692_4d9c_93c6_17b8f34f92f3.slice/crio-d95421759c07bad85609bb3f3ce04be312c2c50d710ded0b72691ab00c3016da WatchSource:0}: Error finding container d95421759c07bad85609bb3f3ce04be312c2c50d710ded0b72691ab00c3016da: Status 404 returned error can't find the container with id d95421759c07bad85609bb3f3ce04be312c2c50d710ded0b72691ab00c3016da Apr 23 17:34:11.779503 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.779476 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc_524ff90d-edcf-4a72-9856-0101b0ec89cf/storage-initializer/1.log" Apr 23 17:34:11.779927 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.779907 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc_524ff90d-edcf-4a72-9856-0101b0ec89cf/storage-initializer/0.log" Apr 23 17:34:11.780013 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.779972 2562 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc" Apr 23 17:34:11.854976 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.854932 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" event={"ID":"b43d81f5-d692-4d9c-93c6-17b8f34f92f3","Type":"ContainerStarted","Data":"d6802b3bf98b9d718ee4a7095aa2db72ebbc744d8d29c3af761508965c2630af"} Apr 23 17:34:11.854976 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.854978 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" event={"ID":"b43d81f5-d692-4d9c-93c6-17b8f34f92f3","Type":"ContainerStarted","Data":"d95421759c07bad85609bb3f3ce04be312c2c50d710ded0b72691ab00c3016da"} Apr 23 17:34:11.856146 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.856125 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc_524ff90d-edcf-4a72-9856-0101b0ec89cf/storage-initializer/1.log" Apr 23 17:34:11.856516 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.856501 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc_524ff90d-edcf-4a72-9856-0101b0ec89cf/storage-initializer/0.log" Apr 23 17:34:11.856571 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.856536 2562 generic.go:358] "Generic (PLEG): container finished" podID="524ff90d-edcf-4a72-9856-0101b0ec89cf" containerID="674ac7b06c3c0ee4617ec597970fc8cc32f932b5a522a549d39fa4bfbbea612f" exitCode=1 Apr 23 17:34:11.856610 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.856597 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc" Apr 23 17:34:11.856645 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.856626 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc" event={"ID":"524ff90d-edcf-4a72-9856-0101b0ec89cf","Type":"ContainerDied","Data":"674ac7b06c3c0ee4617ec597970fc8cc32f932b5a522a549d39fa4bfbbea612f"} Apr 23 17:34:11.856684 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.856656 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc" event={"ID":"524ff90d-edcf-4a72-9856-0101b0ec89cf","Type":"ContainerDied","Data":"3313366be59f7b66bb01b32b86579d49db4661ddfefe256a6d3b6e926d9374a1"} Apr 23 17:34:11.856718 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.856683 2562 scope.go:117] "RemoveContainer" containerID="674ac7b06c3c0ee4617ec597970fc8cc32f932b5a522a549d39fa4bfbbea612f" Apr 23 17:34:11.865661 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.865641 2562 scope.go:117] "RemoveContainer" containerID="43f6b8b49572211bba24278ee62bc236e5f7164d3b5c0556444c5da8af2b388d" Apr 23 17:34:11.874424 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.874406 2562 scope.go:117] "RemoveContainer" containerID="674ac7b06c3c0ee4617ec597970fc8cc32f932b5a522a549d39fa4bfbbea612f" Apr 23 17:34:11.874737 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:34:11.874712 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"674ac7b06c3c0ee4617ec597970fc8cc32f932b5a522a549d39fa4bfbbea612f\": container with ID starting with 674ac7b06c3c0ee4617ec597970fc8cc32f932b5a522a549d39fa4bfbbea612f not found: ID does not exist" containerID="674ac7b06c3c0ee4617ec597970fc8cc32f932b5a522a549d39fa4bfbbea612f" Apr 23 17:34:11.874854 ip-10-0-135-57 kubenswrapper[2562]: I0423 
17:34:11.874772 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"674ac7b06c3c0ee4617ec597970fc8cc32f932b5a522a549d39fa4bfbbea612f"} err="failed to get container status \"674ac7b06c3c0ee4617ec597970fc8cc32f932b5a522a549d39fa4bfbbea612f\": rpc error: code = NotFound desc = could not find container \"674ac7b06c3c0ee4617ec597970fc8cc32f932b5a522a549d39fa4bfbbea612f\": container with ID starting with 674ac7b06c3c0ee4617ec597970fc8cc32f932b5a522a549d39fa4bfbbea612f not found: ID does not exist" Apr 23 17:34:11.874854 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.874801 2562 scope.go:117] "RemoveContainer" containerID="43f6b8b49572211bba24278ee62bc236e5f7164d3b5c0556444c5da8af2b388d" Apr 23 17:34:11.875109 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:34:11.875083 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43f6b8b49572211bba24278ee62bc236e5f7164d3b5c0556444c5da8af2b388d\": container with ID starting with 43f6b8b49572211bba24278ee62bc236e5f7164d3b5c0556444c5da8af2b388d not found: ID does not exist" containerID="43f6b8b49572211bba24278ee62bc236e5f7164d3b5c0556444c5da8af2b388d" Apr 23 17:34:11.875180 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.875123 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43f6b8b49572211bba24278ee62bc236e5f7164d3b5c0556444c5da8af2b388d"} err="failed to get container status \"43f6b8b49572211bba24278ee62bc236e5f7164d3b5c0556444c5da8af2b388d\": rpc error: code = NotFound desc = could not find container \"43f6b8b49572211bba24278ee62bc236e5f7164d3b5c0556444c5da8af2b388d\": container with ID starting with 43f6b8b49572211bba24278ee62bc236e5f7164d3b5c0556444c5da8af2b388d not found: ID does not exist" Apr 23 17:34:11.880092 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.880069 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/524ff90d-edcf-4a72-9856-0101b0ec89cf-kserve-provision-location\") pod \"524ff90d-edcf-4a72-9856-0101b0ec89cf\" (UID: \"524ff90d-edcf-4a72-9856-0101b0ec89cf\") " Apr 23 17:34:11.880372 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.880350 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/524ff90d-edcf-4a72-9856-0101b0ec89cf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "524ff90d-edcf-4a72-9856-0101b0ec89cf" (UID: "524ff90d-edcf-4a72-9856-0101b0ec89cf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:34:11.981546 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:11.981507 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/524ff90d-edcf-4a72-9856-0101b0ec89cf-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:34:12.196244 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:12.196212 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc"] Apr 23 17:34:12.199650 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:12.199625 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-69bb8bfff5-n78pc"] Apr 23 17:34:12.741890 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:12.741855 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="524ff90d-edcf-4a72-9856-0101b0ec89cf" path="/var/lib/kubelet/pods/524ff90d-edcf-4a72-9856-0101b0ec89cf/volumes" Apr 23 17:34:12.861070 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:12.861034 2562 generic.go:358] "Generic (PLEG): container finished" podID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" 
containerID="d6802b3bf98b9d718ee4a7095aa2db72ebbc744d8d29c3af761508965c2630af" exitCode=0 Apr 23 17:34:12.861250 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:12.861123 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" event={"ID":"b43d81f5-d692-4d9c-93c6-17b8f34f92f3","Type":"ContainerDied","Data":"d6802b3bf98b9d718ee4a7095aa2db72ebbc744d8d29c3af761508965c2630af"} Apr 23 17:34:13.866263 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:13.866226 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" event={"ID":"b43d81f5-d692-4d9c-93c6-17b8f34f92f3","Type":"ContainerStarted","Data":"5c5ec1b52e4899899097abacfc968b33eb37add76f37db99ff0bc7875bc41e19"} Apr 23 17:34:13.866668 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:13.866498 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" Apr 23 17:34:13.867752 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:13.867710 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" podUID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 23 17:34:13.884262 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:13.884203 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" podStartSLOduration=2.884187967 podStartE2EDuration="2.884187967s" podCreationTimestamp="2026-04-23 17:34:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:34:13.884032489 +0000 UTC 
m=+3533.736847553" watchObservedRunningTime="2026-04-23 17:34:13.884187967 +0000 UTC m=+3533.737003033" Apr 23 17:34:14.870268 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:14.870221 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" podUID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 23 17:34:24.870469 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:24.870424 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" podUID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 23 17:34:34.871198 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:34.871143 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" podUID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 23 17:34:44.871253 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:44.871209 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" podUID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 23 17:34:54.871047 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:34:54.871000 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" podUID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.66:8080: connect: connection refused" Apr 23 17:35:04.870441 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:04.870392 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" podUID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 23 17:35:14.870533 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:14.870488 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" podUID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 23 17:35:19.739893 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:19.739859 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" Apr 23 17:35:21.085427 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:21.085399 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr"] Apr 23 17:35:21.085824 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:21.085620 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" podUID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" containerName="kserve-container" containerID="cri-o://5c5ec1b52e4899899097abacfc968b33eb37add76f37db99ff0bc7875bc41e19" gracePeriod=30 Apr 23 17:35:22.168691 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:22.168654 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4"] Apr 23 17:35:22.169076 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:22.169003 
2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="524ff90d-edcf-4a72-9856-0101b0ec89cf" containerName="storage-initializer" Apr 23 17:35:22.169076 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:22.169016 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="524ff90d-edcf-4a72-9856-0101b0ec89cf" containerName="storage-initializer" Apr 23 17:35:22.169076 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:22.169026 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="524ff90d-edcf-4a72-9856-0101b0ec89cf" containerName="storage-initializer" Apr 23 17:35:22.169076 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:22.169031 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="524ff90d-edcf-4a72-9856-0101b0ec89cf" containerName="storage-initializer" Apr 23 17:35:22.169218 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:22.169092 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="524ff90d-edcf-4a72-9856-0101b0ec89cf" containerName="storage-initializer" Apr 23 17:35:22.169218 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:22.169104 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="524ff90d-edcf-4a72-9856-0101b0ec89cf" containerName="storage-initializer" Apr 23 17:35:22.171936 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:22.171920 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4" Apr 23 17:35:22.184337 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:22.184310 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4"] Apr 23 17:35:22.295113 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:22.295075 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42b9d18f-a215-4777-83ad-4fffd2a0d594-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4\" (UID: \"42b9d18f-a215-4777-83ad-4fffd2a0d594\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4" Apr 23 17:35:22.395833 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:22.395795 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42b9d18f-a215-4777-83ad-4fffd2a0d594-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4\" (UID: \"42b9d18f-a215-4777-83ad-4fffd2a0d594\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4" Apr 23 17:35:22.396166 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:22.396148 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42b9d18f-a215-4777-83ad-4fffd2a0d594-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4\" (UID: \"42b9d18f-a215-4777-83ad-4fffd2a0d594\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4" Apr 23 17:35:22.481929 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:22.481841 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4" Apr 23 17:35:22.605815 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:22.605791 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4"] Apr 23 17:35:22.608165 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:35:22.608138 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42b9d18f_a215_4777_83ad_4fffd2a0d594.slice/crio-48713cc1442dc0a87a0e713dd658027c5472b852c23af9a6b1ebcb0422afac69 WatchSource:0}: Error finding container 48713cc1442dc0a87a0e713dd658027c5472b852c23af9a6b1ebcb0422afac69: Status 404 returned error can't find the container with id 48713cc1442dc0a87a0e713dd658027c5472b852c23af9a6b1ebcb0422afac69 Apr 23 17:35:23.082566 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:23.082524 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4" event={"ID":"42b9d18f-a215-4777-83ad-4fffd2a0d594","Type":"ContainerStarted","Data":"1f0d8930ddb40863a145a47f8ad7ad4fe0272b848e095ff828cafcca0c602b9f"} Apr 23 17:35:23.082760 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:23.082573 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4" event={"ID":"42b9d18f-a215-4777-83ad-4fffd2a0d594","Type":"ContainerStarted","Data":"48713cc1442dc0a87a0e713dd658027c5472b852c23af9a6b1ebcb0422afac69"} Apr 23 17:35:25.429523 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:25.429502 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" Apr 23 17:35:25.525830 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:25.525726 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b43d81f5-d692-4d9c-93c6-17b8f34f92f3-kserve-provision-location\") pod \"b43d81f5-d692-4d9c-93c6-17b8f34f92f3\" (UID: \"b43d81f5-d692-4d9c-93c6-17b8f34f92f3\") " Apr 23 17:35:25.525830 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:25.525811 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b43d81f5-d692-4d9c-93c6-17b8f34f92f3-cabundle-cert\") pod \"b43d81f5-d692-4d9c-93c6-17b8f34f92f3\" (UID: \"b43d81f5-d692-4d9c-93c6-17b8f34f92f3\") " Apr 23 17:35:25.526051 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:25.526027 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b43d81f5-d692-4d9c-93c6-17b8f34f92f3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b43d81f5-d692-4d9c-93c6-17b8f34f92f3" (UID: "b43d81f5-d692-4d9c-93c6-17b8f34f92f3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:35:25.526131 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:25.526111 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b43d81f5-d692-4d9c-93c6-17b8f34f92f3-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "b43d81f5-d692-4d9c-93c6-17b8f34f92f3" (UID: "b43d81f5-d692-4d9c-93c6-17b8f34f92f3"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:35:25.626766 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:25.626709 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b43d81f5-d692-4d9c-93c6-17b8f34f92f3-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:35:25.626766 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:25.626760 2562 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/b43d81f5-d692-4d9c-93c6-17b8f34f92f3-cabundle-cert\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:35:26.092716 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:26.092680 2562 generic.go:358] "Generic (PLEG): container finished" podID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" containerID="5c5ec1b52e4899899097abacfc968b33eb37add76f37db99ff0bc7875bc41e19" exitCode=0 Apr 23 17:35:26.092900 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:26.092767 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" Apr 23 17:35:26.092900 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:26.092772 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" event={"ID":"b43d81f5-d692-4d9c-93c6-17b8f34f92f3","Type":"ContainerDied","Data":"5c5ec1b52e4899899097abacfc968b33eb37add76f37db99ff0bc7875bc41e19"} Apr 23 17:35:26.092900 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:26.092815 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr" event={"ID":"b43d81f5-d692-4d9c-93c6-17b8f34f92f3","Type":"ContainerDied","Data":"d95421759c07bad85609bb3f3ce04be312c2c50d710ded0b72691ab00c3016da"} Apr 23 17:35:26.092900 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:26.092836 2562 scope.go:117] "RemoveContainer" containerID="5c5ec1b52e4899899097abacfc968b33eb37add76f37db99ff0bc7875bc41e19" Apr 23 17:35:26.100631 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:26.100608 2562 scope.go:117] "RemoveContainer" containerID="d6802b3bf98b9d718ee4a7095aa2db72ebbc744d8d29c3af761508965c2630af" Apr 23 17:35:26.109877 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:26.109858 2562 scope.go:117] "RemoveContainer" containerID="5c5ec1b52e4899899097abacfc968b33eb37add76f37db99ff0bc7875bc41e19" Apr 23 17:35:26.110148 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:35:26.110128 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c5ec1b52e4899899097abacfc968b33eb37add76f37db99ff0bc7875bc41e19\": container with ID starting with 5c5ec1b52e4899899097abacfc968b33eb37add76f37db99ff0bc7875bc41e19 not found: ID does not exist" containerID="5c5ec1b52e4899899097abacfc968b33eb37add76f37db99ff0bc7875bc41e19" Apr 23 17:35:26.110198 ip-10-0-135-57 kubenswrapper[2562]: I0423 
17:35:26.110159 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c5ec1b52e4899899097abacfc968b33eb37add76f37db99ff0bc7875bc41e19"} err="failed to get container status \"5c5ec1b52e4899899097abacfc968b33eb37add76f37db99ff0bc7875bc41e19\": rpc error: code = NotFound desc = could not find container \"5c5ec1b52e4899899097abacfc968b33eb37add76f37db99ff0bc7875bc41e19\": container with ID starting with 5c5ec1b52e4899899097abacfc968b33eb37add76f37db99ff0bc7875bc41e19 not found: ID does not exist" Apr 23 17:35:26.110198 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:26.110177 2562 scope.go:117] "RemoveContainer" containerID="d6802b3bf98b9d718ee4a7095aa2db72ebbc744d8d29c3af761508965c2630af" Apr 23 17:35:26.110423 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:35:26.110405 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6802b3bf98b9d718ee4a7095aa2db72ebbc744d8d29c3af761508965c2630af\": container with ID starting with d6802b3bf98b9d718ee4a7095aa2db72ebbc744d8d29c3af761508965c2630af not found: ID does not exist" containerID="d6802b3bf98b9d718ee4a7095aa2db72ebbc744d8d29c3af761508965c2630af" Apr 23 17:35:26.110487 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:26.110432 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6802b3bf98b9d718ee4a7095aa2db72ebbc744d8d29c3af761508965c2630af"} err="failed to get container status \"d6802b3bf98b9d718ee4a7095aa2db72ebbc744d8d29c3af761508965c2630af\": rpc error: code = NotFound desc = could not find container \"d6802b3bf98b9d718ee4a7095aa2db72ebbc744d8d29c3af761508965c2630af\": container with ID starting with d6802b3bf98b9d718ee4a7095aa2db72ebbc744d8d29c3af761508965c2630af not found: ID does not exist" Apr 23 17:35:26.115415 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:26.115393 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr"] Apr 23 17:35:26.117995 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:26.117975 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-754bf5d7d9-fdzgr"] Apr 23 17:35:26.323185 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:35:26.323147 2562 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42b9d18f_a215_4777_83ad_4fffd2a0d594.slice/crio-1f0d8930ddb40863a145a47f8ad7ad4fe0272b848e095ff828cafcca0c602b9f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42b9d18f_a215_4777_83ad_4fffd2a0d594.slice/crio-conmon-1f0d8930ddb40863a145a47f8ad7ad4fe0272b848e095ff828cafcca0c602b9f.scope\": RecentStats: unable to find data in memory cache]" Apr 23 17:35:26.741818 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:26.741786 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" path="/var/lib/kubelet/pods/b43d81f5-d692-4d9c-93c6-17b8f34f92f3/volumes" Apr 23 17:35:27.098284 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:27.098210 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4_42b9d18f-a215-4777-83ad-4fffd2a0d594/storage-initializer/0.log" Apr 23 17:35:27.098284 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:27.098247 2562 generic.go:358] "Generic (PLEG): container finished" podID="42b9d18f-a215-4777-83ad-4fffd2a0d594" containerID="1f0d8930ddb40863a145a47f8ad7ad4fe0272b848e095ff828cafcca0c602b9f" exitCode=1 Apr 23 17:35:27.098284 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:27.098275 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4" event={"ID":"42b9d18f-a215-4777-83ad-4fffd2a0d594","Type":"ContainerDied","Data":"1f0d8930ddb40863a145a47f8ad7ad4fe0272b848e095ff828cafcca0c602b9f"} Apr 23 17:35:28.102718 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:28.102689 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4_42b9d18f-a215-4777-83ad-4fffd2a0d594/storage-initializer/0.log" Apr 23 17:35:28.103134 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:28.102807 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4" event={"ID":"42b9d18f-a215-4777-83ad-4fffd2a0d594","Type":"ContainerStarted","Data":"9e12ec11de61c2831f15f6e313a008ceac089449c6356b1b194da8c07105274d"} Apr 23 17:35:32.114561 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:32.114531 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4_42b9d18f-a215-4777-83ad-4fffd2a0d594/storage-initializer/1.log" Apr 23 17:35:32.114974 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:32.114930 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4_42b9d18f-a215-4777-83ad-4fffd2a0d594/storage-initializer/0.log" Apr 23 17:35:32.114974 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:32.114966 2562 generic.go:358] "Generic (PLEG): container finished" podID="42b9d18f-a215-4777-83ad-4fffd2a0d594" containerID="9e12ec11de61c2831f15f6e313a008ceac089449c6356b1b194da8c07105274d" exitCode=1 Apr 23 17:35:32.115100 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:32.115048 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4" 
event={"ID":"42b9d18f-a215-4777-83ad-4fffd2a0d594","Type":"ContainerDied","Data":"9e12ec11de61c2831f15f6e313a008ceac089449c6356b1b194da8c07105274d"} Apr 23 17:35:32.115157 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:32.115102 2562 scope.go:117] "RemoveContainer" containerID="1f0d8930ddb40863a145a47f8ad7ad4fe0272b848e095ff828cafcca0c602b9f" Apr 23 17:35:32.115538 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:32.115519 2562 scope.go:117] "RemoveContainer" containerID="1f0d8930ddb40863a145a47f8ad7ad4fe0272b848e095ff828cafcca0c602b9f" Apr 23 17:35:32.125355 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:35:32.125330 2562 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4_kserve-ci-e2e-test_42b9d18f-a215-4777-83ad-4fffd2a0d594_0 in pod sandbox 48713cc1442dc0a87a0e713dd658027c5472b852c23af9a6b1ebcb0422afac69 from index: no such id: '1f0d8930ddb40863a145a47f8ad7ad4fe0272b848e095ff828cafcca0c602b9f'" containerID="1f0d8930ddb40863a145a47f8ad7ad4fe0272b848e095ff828cafcca0c602b9f" Apr 23 17:35:32.125426 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:35:32.125372 2562 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4_kserve-ci-e2e-test_42b9d18f-a215-4777-83ad-4fffd2a0d594_0 in pod sandbox 48713cc1442dc0a87a0e713dd658027c5472b852c23af9a6b1ebcb0422afac69 from index: no such id: '1f0d8930ddb40863a145a47f8ad7ad4fe0272b848e095ff828cafcca0c602b9f'; Skipping pod \"isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4_kserve-ci-e2e-test(42b9d18f-a215-4777-83ad-4fffd2a0d594)\"" logger="UnhandledError" Apr 23 17:35:32.126692 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:35:32.126671 2562 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4_kserve-ci-e2e-test(42b9d18f-a215-4777-83ad-4fffd2a0d594)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4" podUID="42b9d18f-a215-4777-83ad-4fffd2a0d594" Apr 23 17:35:32.168883 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:32.168850 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4"] Apr 23 17:35:33.119336 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.119309 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4_42b9d18f-a215-4777-83ad-4fffd2a0d594/storage-initializer/1.log" Apr 23 17:35:33.242599 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.242574 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4_42b9d18f-a215-4777-83ad-4fffd2a0d594/storage-initializer/1.log" Apr 23 17:35:33.242707 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.242637 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4" Apr 23 17:35:33.272400 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.272366 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492"] Apr 23 17:35:33.272670 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.272659 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42b9d18f-a215-4777-83ad-4fffd2a0d594" containerName="storage-initializer" Apr 23 17:35:33.272721 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.272672 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b9d18f-a215-4777-83ad-4fffd2a0d594" containerName="storage-initializer" Apr 23 17:35:33.272721 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.272686 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" containerName="storage-initializer" Apr 23 17:35:33.272721 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.272693 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" containerName="storage-initializer" Apr 23 17:35:33.272721 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.272706 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" containerName="kserve-container" Apr 23 17:35:33.272721 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.272713 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" containerName="kserve-container" Apr 23 17:35:33.272919 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.272772 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="42b9d18f-a215-4777-83ad-4fffd2a0d594" containerName="storage-initializer" Apr 23 17:35:33.272919 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.272783 2562 
memory_manager.go:356] "RemoveStaleState removing state" podUID="b43d81f5-d692-4d9c-93c6-17b8f34f92f3" containerName="kserve-container" Apr 23 17:35:33.272919 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.272831 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42b9d18f-a215-4777-83ad-4fffd2a0d594" containerName="storage-initializer" Apr 23 17:35:33.272919 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.272837 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="42b9d18f-a215-4777-83ad-4fffd2a0d594" containerName="storage-initializer" Apr 23 17:35:33.272919 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.272883 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="42b9d18f-a215-4777-83ad-4fffd2a0d594" containerName="storage-initializer" Apr 23 17:35:33.275784 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.275767 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" Apr 23 17:35:33.279768 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.279738 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 17:35:33.285967 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.285947 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492"] Apr 23 17:35:33.392249 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.392223 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42b9d18f-a215-4777-83ad-4fffd2a0d594-kserve-provision-location\") pod \"42b9d18f-a215-4777-83ad-4fffd2a0d594\" (UID: \"42b9d18f-a215-4777-83ad-4fffd2a0d594\") " Apr 23 17:35:33.392441 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.392358 2562 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5216d315-e636-453e-adeb-bce3b30f3a30-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492\" (UID: \"5216d315-e636-453e-adeb-bce3b30f3a30\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" Apr 23 17:35:33.392441 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.392400 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5216d315-e636-453e-adeb-bce3b30f3a30-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492\" (UID: \"5216d315-e636-453e-adeb-bce3b30f3a30\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" Apr 23 17:35:33.392559 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.392532 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42b9d18f-a215-4777-83ad-4fffd2a0d594-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "42b9d18f-a215-4777-83ad-4fffd2a0d594" (UID: "42b9d18f-a215-4777-83ad-4fffd2a0d594"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:35:33.493530 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.493484 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5216d315-e636-453e-adeb-bce3b30f3a30-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492\" (UID: \"5216d315-e636-453e-adeb-bce3b30f3a30\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" Apr 23 17:35:33.493705 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.493538 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5216d315-e636-453e-adeb-bce3b30f3a30-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492\" (UID: \"5216d315-e636-453e-adeb-bce3b30f3a30\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" Apr 23 17:35:33.493705 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.493582 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42b9d18f-a215-4777-83ad-4fffd2a0d594-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:35:33.493921 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.493899 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5216d315-e636-453e-adeb-bce3b30f3a30-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492\" (UID: \"5216d315-e636-453e-adeb-bce3b30f3a30\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" Apr 23 17:35:33.494238 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.494216 2562 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5216d315-e636-453e-adeb-bce3b30f3a30-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492\" (UID: \"5216d315-e636-453e-adeb-bce3b30f3a30\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" Apr 23 17:35:33.586260 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.586222 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" Apr 23 17:35:33.710627 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:33.710606 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492"] Apr 23 17:35:33.712814 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:35:33.712786 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5216d315_e636_453e_adeb_bce3b30f3a30.slice/crio-90aeb390ab54c94fc248db4b52392d8a8b9e6ca54087718b93733209cc7d0f74 WatchSource:0}: Error finding container 90aeb390ab54c94fc248db4b52392d8a8b9e6ca54087718b93733209cc7d0f74: Status 404 returned error can't find the container with id 90aeb390ab54c94fc248db4b52392d8a8b9e6ca54087718b93733209cc7d0f74 Apr 23 17:35:34.124277 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:34.124192 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" event={"ID":"5216d315-e636-453e-adeb-bce3b30f3a30","Type":"ContainerStarted","Data":"948383c7e99b2266023954fd2ee9562db560fea8fbd8162a04c4deeee0737e2d"} Apr 23 17:35:34.124277 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:34.124230 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" 
event={"ID":"5216d315-e636-453e-adeb-bce3b30f3a30","Type":"ContainerStarted","Data":"90aeb390ab54c94fc248db4b52392d8a8b9e6ca54087718b93733209cc7d0f74"} Apr 23 17:35:34.125466 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:34.125443 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4_42b9d18f-a215-4777-83ad-4fffd2a0d594/storage-initializer/1.log" Apr 23 17:35:34.125586 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:34.125530 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4" event={"ID":"42b9d18f-a215-4777-83ad-4fffd2a0d594","Type":"ContainerDied","Data":"48713cc1442dc0a87a0e713dd658027c5472b852c23af9a6b1ebcb0422afac69"} Apr 23 17:35:34.125586 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:34.125553 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4" Apr 23 17:35:34.125586 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:34.125567 2562 scope.go:117] "RemoveContainer" containerID="9e12ec11de61c2831f15f6e313a008ceac089449c6356b1b194da8c07105274d" Apr 23 17:35:34.171104 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:34.171067 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4"] Apr 23 17:35:34.175851 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:34.175819 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66544dbf9-jqgd4"] Apr 23 17:35:34.742401 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:34.742363 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42b9d18f-a215-4777-83ad-4fffd2a0d594" path="/var/lib/kubelet/pods/42b9d18f-a215-4777-83ad-4fffd2a0d594/volumes" Apr 23 17:35:35.129689 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:35:35.129651 2562 generic.go:358] "Generic (PLEG): container finished" podID="5216d315-e636-453e-adeb-bce3b30f3a30" containerID="948383c7e99b2266023954fd2ee9562db560fea8fbd8162a04c4deeee0737e2d" exitCode=0 Apr 23 17:35:35.130138 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:35.129738 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" event={"ID":"5216d315-e636-453e-adeb-bce3b30f3a30","Type":"ContainerDied","Data":"948383c7e99b2266023954fd2ee9562db560fea8fbd8162a04c4deeee0737e2d"} Apr 23 17:35:36.135686 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:36.135649 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" event={"ID":"5216d315-e636-453e-adeb-bce3b30f3a30","Type":"ContainerStarted","Data":"442680f6a1d92757a12d4989d1322f48bc027bd18dff8a0c1e2c04930ce8802a"} Apr 23 17:35:36.136286 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:36.135861 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" Apr 23 17:35:36.137149 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:36.137117 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 23 17:35:36.167452 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:36.167398 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" podStartSLOduration=3.167382544 podStartE2EDuration="3.167382544s" podCreationTimestamp="2026-04-23 17:35:33 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:35:36.164949802 +0000 UTC m=+3616.017764866" watchObservedRunningTime="2026-04-23 17:35:36.167382544 +0000 UTC m=+3616.020197634" Apr 23 17:35:37.139154 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:37.139102 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 23 17:35:47.139247 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:47.139199 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 23 17:35:57.139909 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:35:57.139865 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 23 17:36:07.140128 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:07.140078 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 23 17:36:17.139440 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:17.139384 2562 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 23 17:36:27.139898 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:27.139859 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 23 17:36:37.139703 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:37.139656 2562 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 23 17:36:47.140945 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:47.140916 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" Apr 23 17:36:53.322179 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:53.322142 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492"] Apr 23 17:36:53.322568 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:53.322414 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" containerName="kserve-container" containerID="cri-o://442680f6a1d92757a12d4989d1322f48bc027bd18dff8a0c1e2c04930ce8802a" gracePeriod=30 Apr 23 17:36:54.370651 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:54.370618 2562 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5"] Apr 23 17:36:54.373911 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:54.373895 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5" Apr 23 17:36:54.382428 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:54.382399 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5"] Apr 23 17:36:54.488757 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:54.488710 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/631f915a-faa8-4b28-ba6f-d325d14f0588-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5\" (UID: \"631f915a-faa8-4b28-ba6f-d325d14f0588\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5" Apr 23 17:36:54.590253 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:54.590214 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/631f915a-faa8-4b28-ba6f-d325d14f0588-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5\" (UID: \"631f915a-faa8-4b28-ba6f-d325d14f0588\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5" Apr 23 17:36:54.590621 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:54.590601 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/631f915a-faa8-4b28-ba6f-d325d14f0588-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5\" (UID: 
\"631f915a-faa8-4b28-ba6f-d325d14f0588\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5" Apr 23 17:36:54.684224 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:54.684175 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5" Apr 23 17:36:54.809139 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:54.809107 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5"] Apr 23 17:36:54.812421 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:36:54.812397 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod631f915a_faa8_4b28_ba6f_d325d14f0588.slice/crio-01e4e731aafbea00ffcd0fd7b1d33b6b1fe91aa0a7239e3afdde9ab4e596a577 WatchSource:0}: Error finding container 01e4e731aafbea00ffcd0fd7b1d33b6b1fe91aa0a7239e3afdde9ab4e596a577: Status 404 returned error can't find the container with id 01e4e731aafbea00ffcd0fd7b1d33b6b1fe91aa0a7239e3afdde9ab4e596a577 Apr 23 17:36:55.374734 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:55.374697 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5" event={"ID":"631f915a-faa8-4b28-ba6f-d325d14f0588","Type":"ContainerStarted","Data":"5cfea98d1fc439c7073e7eb69cb84c04eca31724bb99071e224fcb1ba8a947c6"} Apr 23 17:36:55.375213 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:55.374759 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5" event={"ID":"631f915a-faa8-4b28-ba6f-d325d14f0588","Type":"ContainerStarted","Data":"01e4e731aafbea00ffcd0fd7b1d33b6b1fe91aa0a7239e3afdde9ab4e596a577"} Apr 23 17:36:57.139839 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:57.139794 2562 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 23 17:36:57.759103 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:57.759081 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" Apr 23 17:36:57.920921 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:57.920889 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5216d315-e636-453e-adeb-bce3b30f3a30-kserve-provision-location\") pod \"5216d315-e636-453e-adeb-bce3b30f3a30\" (UID: \"5216d315-e636-453e-adeb-bce3b30f3a30\") " Apr 23 17:36:57.921113 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:57.920939 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5216d315-e636-453e-adeb-bce3b30f3a30-cabundle-cert\") pod \"5216d315-e636-453e-adeb-bce3b30f3a30\" (UID: \"5216d315-e636-453e-adeb-bce3b30f3a30\") " Apr 23 17:36:57.921223 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:57.921157 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5216d315-e636-453e-adeb-bce3b30f3a30-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5216d315-e636-453e-adeb-bce3b30f3a30" (UID: "5216d315-e636-453e-adeb-bce3b30f3a30"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:36:57.921306 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:57.921286 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5216d315-e636-453e-adeb-bce3b30f3a30-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "5216d315-e636-453e-adeb-bce3b30f3a30" (UID: "5216d315-e636-453e-adeb-bce3b30f3a30"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 17:36:58.022297 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.022254 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5216d315-e636-453e-adeb-bce3b30f3a30-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:36:58.022297 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.022298 2562 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5216d315-e636-453e-adeb-bce3b30f3a30-cabundle-cert\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:36:58.385713 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.385674 2562 generic.go:358] "Generic (PLEG): container finished" podID="5216d315-e636-453e-adeb-bce3b30f3a30" containerID="442680f6a1d92757a12d4989d1322f48bc027bd18dff8a0c1e2c04930ce8802a" exitCode=0 Apr 23 17:36:58.386166 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.385759 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" Apr 23 17:36:58.386166 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.385774 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" event={"ID":"5216d315-e636-453e-adeb-bce3b30f3a30","Type":"ContainerDied","Data":"442680f6a1d92757a12d4989d1322f48bc027bd18dff8a0c1e2c04930ce8802a"} Apr 23 17:36:58.386166 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.385814 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492" event={"ID":"5216d315-e636-453e-adeb-bce3b30f3a30","Type":"ContainerDied","Data":"90aeb390ab54c94fc248db4b52392d8a8b9e6ca54087718b93733209cc7d0f74"} Apr 23 17:36:58.386166 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.385833 2562 scope.go:117] "RemoveContainer" containerID="442680f6a1d92757a12d4989d1322f48bc027bd18dff8a0c1e2c04930ce8802a" Apr 23 17:36:58.387341 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.387323 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5_631f915a-faa8-4b28-ba6f-d325d14f0588/storage-initializer/0.log" Apr 23 17:36:58.387432 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.387360 2562 generic.go:358] "Generic (PLEG): container finished" podID="631f915a-faa8-4b28-ba6f-d325d14f0588" containerID="5cfea98d1fc439c7073e7eb69cb84c04eca31724bb99071e224fcb1ba8a947c6" exitCode=1 Apr 23 17:36:58.387432 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.387416 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5" event={"ID":"631f915a-faa8-4b28-ba6f-d325d14f0588","Type":"ContainerDied","Data":"5cfea98d1fc439c7073e7eb69cb84c04eca31724bb99071e224fcb1ba8a947c6"} Apr 23 
17:36:58.399821 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.399798 2562 scope.go:117] "RemoveContainer" containerID="948383c7e99b2266023954fd2ee9562db560fea8fbd8162a04c4deeee0737e2d" Apr 23 17:36:58.406818 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.406797 2562 scope.go:117] "RemoveContainer" containerID="442680f6a1d92757a12d4989d1322f48bc027bd18dff8a0c1e2c04930ce8802a" Apr 23 17:36:58.407095 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:36:58.407071 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"442680f6a1d92757a12d4989d1322f48bc027bd18dff8a0c1e2c04930ce8802a\": container with ID starting with 442680f6a1d92757a12d4989d1322f48bc027bd18dff8a0c1e2c04930ce8802a not found: ID does not exist" containerID="442680f6a1d92757a12d4989d1322f48bc027bd18dff8a0c1e2c04930ce8802a" Apr 23 17:36:58.407192 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.407102 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"442680f6a1d92757a12d4989d1322f48bc027bd18dff8a0c1e2c04930ce8802a"} err="failed to get container status \"442680f6a1d92757a12d4989d1322f48bc027bd18dff8a0c1e2c04930ce8802a\": rpc error: code = NotFound desc = could not find container \"442680f6a1d92757a12d4989d1322f48bc027bd18dff8a0c1e2c04930ce8802a\": container with ID starting with 442680f6a1d92757a12d4989d1322f48bc027bd18dff8a0c1e2c04930ce8802a not found: ID does not exist" Apr 23 17:36:58.407192 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.407121 2562 scope.go:117] "RemoveContainer" containerID="948383c7e99b2266023954fd2ee9562db560fea8fbd8162a04c4deeee0737e2d" Apr 23 17:36:58.407367 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:36:58.407346 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948383c7e99b2266023954fd2ee9562db560fea8fbd8162a04c4deeee0737e2d\": container with ID starting with 
948383c7e99b2266023954fd2ee9562db560fea8fbd8162a04c4deeee0737e2d not found: ID does not exist" containerID="948383c7e99b2266023954fd2ee9562db560fea8fbd8162a04c4deeee0737e2d" Apr 23 17:36:58.407437 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.407373 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948383c7e99b2266023954fd2ee9562db560fea8fbd8162a04c4deeee0737e2d"} err="failed to get container status \"948383c7e99b2266023954fd2ee9562db560fea8fbd8162a04c4deeee0737e2d\": rpc error: code = NotFound desc = could not find container \"948383c7e99b2266023954fd2ee9562db560fea8fbd8162a04c4deeee0737e2d\": container with ID starting with 948383c7e99b2266023954fd2ee9562db560fea8fbd8162a04c4deeee0737e2d not found: ID does not exist" Apr 23 17:36:58.422609 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.422576 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492"] Apr 23 17:36:58.424132 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.424106 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-598c68cd88-ng492"] Apr 23 17:36:58.742269 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:58.742237 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" path="/var/lib/kubelet/pods/5216d315-e636-453e-adeb-bce3b30f3a30/volumes" Apr 23 17:36:59.394690 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:59.394663 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5_631f915a-faa8-4b28-ba6f-d325d14f0588/storage-initializer/0.log" Apr 23 17:36:59.395104 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:36:59.394764 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5" event={"ID":"631f915a-faa8-4b28-ba6f-d325d14f0588","Type":"ContainerStarted","Data":"a04007f9819e5e04ef8e8c6f9d64672f78d67d87bbc33f6220362c8bc4166e5f"} Apr 23 17:37:04.402382 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:04.400161 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5"] Apr 23 17:37:04.402382 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:04.400565 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5" podUID="631f915a-faa8-4b28-ba6f-d325d14f0588" containerName="storage-initializer" containerID="cri-o://a04007f9819e5e04ef8e8c6f9d64672f78d67d87bbc33f6220362c8bc4166e5f" gracePeriod=30 Apr 23 17:37:04.410660 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:04.410639 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5_631f915a-faa8-4b28-ba6f-d325d14f0588/storage-initializer/1.log" Apr 23 17:37:04.411043 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:04.411030 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5_631f915a-faa8-4b28-ba6f-d325d14f0588/storage-initializer/0.log" Apr 23 17:37:04.411149 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:04.411066 2562 generic.go:358] "Generic (PLEG): container finished" podID="631f915a-faa8-4b28-ba6f-d325d14f0588" containerID="a04007f9819e5e04ef8e8c6f9d64672f78d67d87bbc33f6220362c8bc4166e5f" exitCode=1 Apr 23 17:37:04.411149 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:04.411093 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5" 
event={"ID":"631f915a-faa8-4b28-ba6f-d325d14f0588","Type":"ContainerDied","Data":"a04007f9819e5e04ef8e8c6f9d64672f78d67d87bbc33f6220362c8bc4166e5f"} Apr 23 17:37:04.411149 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:04.411121 2562 scope.go:117] "RemoveContainer" containerID="5cfea98d1fc439c7073e7eb69cb84c04eca31724bb99071e224fcb1ba8a947c6" Apr 23 17:37:04.530201 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:04.530179 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5_631f915a-faa8-4b28-ba6f-d325d14f0588/storage-initializer/1.log" Apr 23 17:37:04.530340 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:04.530243 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5" Apr 23 17:37:04.680106 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:04.680009 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/631f915a-faa8-4b28-ba6f-d325d14f0588-kserve-provision-location\") pod \"631f915a-faa8-4b28-ba6f-d325d14f0588\" (UID: \"631f915a-faa8-4b28-ba6f-d325d14f0588\") " Apr 23 17:37:04.680333 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:04.680308 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/631f915a-faa8-4b28-ba6f-d325d14f0588-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "631f915a-faa8-4b28-ba6f-d325d14f0588" (UID: "631f915a-faa8-4b28-ba6f-d325d14f0588"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:37:04.781506 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:04.781471 2562 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/631f915a-faa8-4b28-ba6f-d325d14f0588-kserve-provision-location\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:37:05.415057 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:05.415028 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5_631f915a-faa8-4b28-ba6f-d325d14f0588/storage-initializer/1.log" Apr 23 17:37:05.415471 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:05.415146 2562 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5" Apr 23 17:37:05.415471 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:05.415161 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5" event={"ID":"631f915a-faa8-4b28-ba6f-d325d14f0588","Type":"ContainerDied","Data":"01e4e731aafbea00ffcd0fd7b1d33b6b1fe91aa0a7239e3afdde9ab4e596a577"} Apr 23 17:37:05.415471 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:05.415205 2562 scope.go:117] "RemoveContainer" containerID="a04007f9819e5e04ef8e8c6f9d64672f78d67d87bbc33f6220362c8bc4166e5f" Apr 23 17:37:05.445981 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:05.445946 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5"] Apr 23 17:37:05.449019 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:05.448985 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-767cdd64f8-svqw5"] Apr 23 17:37:06.256393 ip-10-0-135-57 kubenswrapper[2562]: I0423 
17:37:06.256358 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6dr26/must-gather-bwclz"] Apr 23 17:37:06.256679 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.256666 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="631f915a-faa8-4b28-ba6f-d325d14f0588" containerName="storage-initializer" Apr 23 17:37:06.256766 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.256680 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="631f915a-faa8-4b28-ba6f-d325d14f0588" containerName="storage-initializer" Apr 23 17:37:06.256766 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.256690 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="631f915a-faa8-4b28-ba6f-d325d14f0588" containerName="storage-initializer" Apr 23 17:37:06.256766 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.256695 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="631f915a-faa8-4b28-ba6f-d325d14f0588" containerName="storage-initializer" Apr 23 17:37:06.256766 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.256703 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" containerName="storage-initializer" Apr 23 17:37:06.256766 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.256708 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" containerName="storage-initializer" Apr 23 17:37:06.256766 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.256722 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" containerName="kserve-container" Apr 23 17:37:06.256766 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.256727 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" containerName="kserve-container" Apr 23 17:37:06.256982 ip-10-0-135-57 kubenswrapper[2562]: I0423 
17:37:06.256786 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="631f915a-faa8-4b28-ba6f-d325d14f0588" containerName="storage-initializer" Apr 23 17:37:06.256982 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.256796 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="5216d315-e636-453e-adeb-bce3b30f3a30" containerName="kserve-container" Apr 23 17:37:06.256982 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.256803 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="631f915a-faa8-4b28-ba6f-d325d14f0588" containerName="storage-initializer" Apr 23 17:37:06.260951 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.260932 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6dr26/must-gather-bwclz" Apr 23 17:37:06.263801 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.263779 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6dr26\"/\"kube-root-ca.crt\"" Apr 23 17:37:06.263908 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.263801 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6dr26\"/\"openshift-service-ca.crt\"" Apr 23 17:37:06.265019 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.264996 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6dr26\"/\"default-dockercfg-5l8wv\"" Apr 23 17:37:06.267798 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.267774 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6dr26/must-gather-bwclz"] Apr 23 17:37:06.395929 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.395900 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwlp7\" (UniqueName: \"kubernetes.io/projected/d13d9c09-b7d7-426a-98ba-599bc728f39e-kube-api-access-hwlp7\") pod \"must-gather-bwclz\" 
(UID: \"d13d9c09-b7d7-426a-98ba-599bc728f39e\") " pod="openshift-must-gather-6dr26/must-gather-bwclz" Apr 23 17:37:06.395929 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.395933 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d13d9c09-b7d7-426a-98ba-599bc728f39e-must-gather-output\") pod \"must-gather-bwclz\" (UID: \"d13d9c09-b7d7-426a-98ba-599bc728f39e\") " pod="openshift-must-gather-6dr26/must-gather-bwclz" Apr 23 17:37:06.496834 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.496793 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwlp7\" (UniqueName: \"kubernetes.io/projected/d13d9c09-b7d7-426a-98ba-599bc728f39e-kube-api-access-hwlp7\") pod \"must-gather-bwclz\" (UID: \"d13d9c09-b7d7-426a-98ba-599bc728f39e\") " pod="openshift-must-gather-6dr26/must-gather-bwclz" Apr 23 17:37:06.496834 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.496833 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d13d9c09-b7d7-426a-98ba-599bc728f39e-must-gather-output\") pod \"must-gather-bwclz\" (UID: \"d13d9c09-b7d7-426a-98ba-599bc728f39e\") " pod="openshift-must-gather-6dr26/must-gather-bwclz" Apr 23 17:37:06.497270 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.497144 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d13d9c09-b7d7-426a-98ba-599bc728f39e-must-gather-output\") pod \"must-gather-bwclz\" (UID: \"d13d9c09-b7d7-426a-98ba-599bc728f39e\") " pod="openshift-must-gather-6dr26/must-gather-bwclz" Apr 23 17:37:06.505338 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.505319 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwlp7\" (UniqueName: 
\"kubernetes.io/projected/d13d9c09-b7d7-426a-98ba-599bc728f39e-kube-api-access-hwlp7\") pod \"must-gather-bwclz\" (UID: \"d13d9c09-b7d7-426a-98ba-599bc728f39e\") " pod="openshift-must-gather-6dr26/must-gather-bwclz" Apr 23 17:37:06.581793 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.581695 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6dr26/must-gather-bwclz" Apr 23 17:37:06.704435 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.704410 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6dr26/must-gather-bwclz"] Apr 23 17:37:06.706764 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:37:06.706722 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd13d9c09_b7d7_426a_98ba_599bc728f39e.slice/crio-8cfe01f49f55e99bad6cb09fc6db2f563b9be2b2b28b2ecaef7f5713006c8281 WatchSource:0}: Error finding container 8cfe01f49f55e99bad6cb09fc6db2f563b9be2b2b28b2ecaef7f5713006c8281: Status 404 returned error can't find the container with id 8cfe01f49f55e99bad6cb09fc6db2f563b9be2b2b28b2ecaef7f5713006c8281 Apr 23 17:37:06.741897 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:06.741861 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="631f915a-faa8-4b28-ba6f-d325d14f0588" path="/var/lib/kubelet/pods/631f915a-faa8-4b28-ba6f-d325d14f0588/volumes" Apr 23 17:37:07.424422 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:07.424382 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6dr26/must-gather-bwclz" event={"ID":"d13d9c09-b7d7-426a-98ba-599bc728f39e","Type":"ContainerStarted","Data":"8cfe01f49f55e99bad6cb09fc6db2f563b9be2b2b28b2ecaef7f5713006c8281"} Apr 23 17:37:11.442834 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:11.442786 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6dr26/must-gather-bwclz" 
event={"ID":"d13d9c09-b7d7-426a-98ba-599bc728f39e","Type":"ContainerStarted","Data":"ad9e0897d02c3c05a7583e1d25155aac7dec55b4d24fdbdc118a3c4158f2c26d"} Apr 23 17:37:11.442834 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:11.442831 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6dr26/must-gather-bwclz" event={"ID":"d13d9c09-b7d7-426a-98ba-599bc728f39e","Type":"ContainerStarted","Data":"660a2a5237efa218b2fd2bed7093b5733fa3b99994fc4e9cdf6ea894ee2f9644"} Apr 23 17:37:11.461136 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:11.461070 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6dr26/must-gather-bwclz" podStartSLOduration=1.298120141 podStartE2EDuration="5.461050345s" podCreationTimestamp="2026-04-23 17:37:06 +0000 UTC" firstStartedPulling="2026-04-23 17:37:06.708406371 +0000 UTC m=+3706.561221412" lastFinishedPulling="2026-04-23 17:37:10.871336561 +0000 UTC m=+3710.724151616" observedRunningTime="2026-04-23 17:37:11.4592264 +0000 UTC m=+3711.312041465" watchObservedRunningTime="2026-04-23 17:37:11.461050345 +0000 UTC m=+3711.313865413" Apr 23 17:37:31.510500 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:31.510465 2562 generic.go:358] "Generic (PLEG): container finished" podID="d13d9c09-b7d7-426a-98ba-599bc728f39e" containerID="660a2a5237efa218b2fd2bed7093b5733fa3b99994fc4e9cdf6ea894ee2f9644" exitCode=0 Apr 23 17:37:31.510919 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:31.510537 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6dr26/must-gather-bwclz" event={"ID":"d13d9c09-b7d7-426a-98ba-599bc728f39e","Type":"ContainerDied","Data":"660a2a5237efa218b2fd2bed7093b5733fa3b99994fc4e9cdf6ea894ee2f9644"} Apr 23 17:37:31.510972 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:31.510931 2562 scope.go:117] "RemoveContainer" containerID="660a2a5237efa218b2fd2bed7093b5733fa3b99994fc4e9cdf6ea894ee2f9644" Apr 23 17:37:32.331304 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:37:32.331262 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6dr26_must-gather-bwclz_d13d9c09-b7d7-426a-98ba-599bc728f39e/gather/0.log" Apr 23 17:37:33.014237 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:33.014201 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lzvhv/must-gather-27z4t"] Apr 23 17:37:33.018859 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:33.018842 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lzvhv/must-gather-27z4t" Apr 23 17:37:33.021804 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:33.021783 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lzvhv\"/\"openshift-service-ca.crt\"" Apr 23 17:37:33.021921 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:33.021783 2562 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-lzvhv\"/\"default-dockercfg-xx7s6\"" Apr 23 17:37:33.022147 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:33.022133 2562 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lzvhv\"/\"kube-root-ca.crt\"" Apr 23 17:37:33.030568 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:33.030540 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lzvhv/must-gather-27z4t"] Apr 23 17:37:33.136273 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:33.136227 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l5g9\" (UniqueName: \"kubernetes.io/projected/0a888d54-babc-4d2e-a252-402faa1848b6-kube-api-access-2l5g9\") pod \"must-gather-27z4t\" (UID: \"0a888d54-babc-4d2e-a252-402faa1848b6\") " pod="openshift-must-gather-lzvhv/must-gather-27z4t" Apr 23 17:37:33.136452 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:33.136295 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a888d54-babc-4d2e-a252-402faa1848b6-must-gather-output\") pod \"must-gather-27z4t\" (UID: \"0a888d54-babc-4d2e-a252-402faa1848b6\") " pod="openshift-must-gather-lzvhv/must-gather-27z4t" Apr 23 17:37:33.237498 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:33.237463 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2l5g9\" (UniqueName: \"kubernetes.io/projected/0a888d54-babc-4d2e-a252-402faa1848b6-kube-api-access-2l5g9\") pod \"must-gather-27z4t\" (UID: \"0a888d54-babc-4d2e-a252-402faa1848b6\") " pod="openshift-must-gather-lzvhv/must-gather-27z4t" Apr 23 17:37:33.237644 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:33.237522 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a888d54-babc-4d2e-a252-402faa1848b6-must-gather-output\") pod \"must-gather-27z4t\" (UID: \"0a888d54-babc-4d2e-a252-402faa1848b6\") " pod="openshift-must-gather-lzvhv/must-gather-27z4t" Apr 23 17:37:33.237829 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:33.237813 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a888d54-babc-4d2e-a252-402faa1848b6-must-gather-output\") pod \"must-gather-27z4t\" (UID: \"0a888d54-babc-4d2e-a252-402faa1848b6\") " pod="openshift-must-gather-lzvhv/must-gather-27z4t" Apr 23 17:37:33.246359 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:33.246331 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l5g9\" (UniqueName: \"kubernetes.io/projected/0a888d54-babc-4d2e-a252-402faa1848b6-kube-api-access-2l5g9\") pod \"must-gather-27z4t\" (UID: \"0a888d54-babc-4d2e-a252-402faa1848b6\") " pod="openshift-must-gather-lzvhv/must-gather-27z4t" Apr 23 17:37:33.328522 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:37:33.328422 2562 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-lzvhv/must-gather-27z4t" Apr 23 17:37:33.451604 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:33.451578 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lzvhv/must-gather-27z4t"] Apr 23 17:37:33.454226 ip-10-0-135-57 kubenswrapper[2562]: W0423 17:37:33.454191 2562 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a888d54_babc_4d2e_a252_402faa1848b6.slice/crio-0f0e2a50088af8a6e3df884bff9fe40bec9b4c769a84fcc3584f33b7d3dddf66 WatchSource:0}: Error finding container 0f0e2a50088af8a6e3df884bff9fe40bec9b4c769a84fcc3584f33b7d3dddf66: Status 404 returned error can't find the container with id 0f0e2a50088af8a6e3df884bff9fe40bec9b4c769a84fcc3584f33b7d3dddf66 Apr 23 17:37:33.517183 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:33.517144 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lzvhv/must-gather-27z4t" event={"ID":"0a888d54-babc-4d2e-a252-402faa1848b6","Type":"ContainerStarted","Data":"0f0e2a50088af8a6e3df884bff9fe40bec9b4c769a84fcc3584f33b7d3dddf66"} Apr 23 17:37:34.523575 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:34.523532 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lzvhv/must-gather-27z4t" event={"ID":"0a888d54-babc-4d2e-a252-402faa1848b6","Type":"ContainerStarted","Data":"8fc86fcdb13b2895a2dbae3765c709bf4285d80317d1d268d550c44192fbd438"} Apr 23 17:37:35.529677 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:35.529617 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lzvhv/must-gather-27z4t" event={"ID":"0a888d54-babc-4d2e-a252-402faa1848b6","Type":"ContainerStarted","Data":"e6584255d1d81cf4fa912cdc4979985f0dbf8db5da45158dfd50f4d6a9d14f34"} Apr 23 17:37:35.548112 ip-10-0-135-57 kubenswrapper[2562]: 
I0423 17:37:35.548051 2562 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lzvhv/must-gather-27z4t" podStartSLOduration=2.6619550480000003 podStartE2EDuration="3.548029584s" podCreationTimestamp="2026-04-23 17:37:32 +0000 UTC" firstStartedPulling="2026-04-23 17:37:33.456005791 +0000 UTC m=+3733.308820833" lastFinishedPulling="2026-04-23 17:37:34.342080328 +0000 UTC m=+3734.194895369" observedRunningTime="2026-04-23 17:37:35.547409313 +0000 UTC m=+3735.400224376" watchObservedRunningTime="2026-04-23 17:37:35.548029584 +0000 UTC m=+3735.400844650" Apr 23 17:37:36.147908 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:36.147879 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qzdzg_451de1c0-4375-4c37-8a62-0641aa75255d/global-pull-secret-syncer/0.log" Apr 23 17:37:36.263305 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:36.263273 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-86gqj_a58d25fc-7e16-47db-81f3-d8e27f59a92b/konnectivity-agent/0.log" Apr 23 17:37:36.393301 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:36.393270 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-57.ec2.internal_cb72edff024abc1bfc776fb556b861df/haproxy/0.log" Apr 23 17:37:37.833129 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:37.833090 2562 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6dr26/must-gather-bwclz"] Apr 23 17:37:37.833678 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:37.833472 2562 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-6dr26/must-gather-bwclz" podUID="d13d9c09-b7d7-426a-98ba-599bc728f39e" containerName="copy" containerID="cri-o://ad9e0897d02c3c05a7583e1d25155aac7dec55b4d24fdbdc118a3c4158f2c26d" gracePeriod=2 Apr 23 17:37:37.841048 ip-10-0-135-57 kubenswrapper[2562]: I0423 
17:37:37.841018 2562 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6dr26/must-gather-bwclz"] Apr 23 17:37:37.841560 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:37.841520 2562 status_manager.go:895] "Failed to get status for pod" podUID="d13d9c09-b7d7-426a-98ba-599bc728f39e" pod="openshift-must-gather-6dr26/must-gather-bwclz" err="pods \"must-gather-bwclz\" is forbidden: User \"system:node:ip-10-0-135-57.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6dr26\": no relationship found between node 'ip-10-0-135-57.ec2.internal' and this object" Apr 23 17:37:38.227156 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.227126 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6dr26_must-gather-bwclz_d13d9c09-b7d7-426a-98ba-599bc728f39e/copy/0.log" Apr 23 17:37:38.241382 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.234641 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6dr26/must-gather-bwclz" Apr 23 17:37:38.241382 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.237613 2562 status_manager.go:895] "Failed to get status for pod" podUID="d13d9c09-b7d7-426a-98ba-599bc728f39e" pod="openshift-must-gather-6dr26/must-gather-bwclz" err="pods \"must-gather-bwclz\" is forbidden: User \"system:node:ip-10-0-135-57.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6dr26\": no relationship found between node 'ip-10-0-135-57.ec2.internal' and this object" Apr 23 17:37:38.385555 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.385511 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d13d9c09-b7d7-426a-98ba-599bc728f39e-must-gather-output\") pod \"d13d9c09-b7d7-426a-98ba-599bc728f39e\" (UID: \"d13d9c09-b7d7-426a-98ba-599bc728f39e\") " Apr 23 17:37:38.385761 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.385638 2562 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwlp7\" (UniqueName: \"kubernetes.io/projected/d13d9c09-b7d7-426a-98ba-599bc728f39e-kube-api-access-hwlp7\") pod \"d13d9c09-b7d7-426a-98ba-599bc728f39e\" (UID: \"d13d9c09-b7d7-426a-98ba-599bc728f39e\") " Apr 23 17:37:38.388293 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.387110 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d13d9c09-b7d7-426a-98ba-599bc728f39e-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d13d9c09-b7d7-426a-98ba-599bc728f39e" (UID: "d13d9c09-b7d7-426a-98ba-599bc728f39e"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 17:37:38.390263 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.390238 2562 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d13d9c09-b7d7-426a-98ba-599bc728f39e-kube-api-access-hwlp7" (OuterVolumeSpecName: "kube-api-access-hwlp7") pod "d13d9c09-b7d7-426a-98ba-599bc728f39e" (UID: "d13d9c09-b7d7-426a-98ba-599bc728f39e"). InnerVolumeSpecName "kube-api-access-hwlp7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 17:37:38.487490 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.487401 2562 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hwlp7\" (UniqueName: \"kubernetes.io/projected/d13d9c09-b7d7-426a-98ba-599bc728f39e-kube-api-access-hwlp7\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:37:38.487490 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.487445 2562 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d13d9c09-b7d7-426a-98ba-599bc728f39e-must-gather-output\") on node \"ip-10-0-135-57.ec2.internal\" DevicePath \"\"" Apr 23 17:37:38.545458 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.545405 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6dr26_must-gather-bwclz_d13d9c09-b7d7-426a-98ba-599bc728f39e/copy/0.log" Apr 23 17:37:38.545875 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.545844 2562 generic.go:358] "Generic (PLEG): container finished" podID="d13d9c09-b7d7-426a-98ba-599bc728f39e" containerID="ad9e0897d02c3c05a7583e1d25155aac7dec55b4d24fdbdc118a3c4158f2c26d" exitCode=143 Apr 23 17:37:38.546008 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.545906 2562 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6dr26/must-gather-bwclz" Apr 23 17:37:38.546008 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.545938 2562 scope.go:117] "RemoveContainer" containerID="ad9e0897d02c3c05a7583e1d25155aac7dec55b4d24fdbdc118a3c4158f2c26d" Apr 23 17:37:38.548808 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.548737 2562 status_manager.go:895] "Failed to get status for pod" podUID="d13d9c09-b7d7-426a-98ba-599bc728f39e" pod="openshift-must-gather-6dr26/must-gather-bwclz" err="pods \"must-gather-bwclz\" is forbidden: User \"system:node:ip-10-0-135-57.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6dr26\": no relationship found between node 'ip-10-0-135-57.ec2.internal' and this object" Apr 23 17:37:38.560302 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.559835 2562 scope.go:117] "RemoveContainer" containerID="660a2a5237efa218b2fd2bed7093b5733fa3b99994fc4e9cdf6ea894ee2f9644" Apr 23 17:37:38.562570 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.562468 2562 status_manager.go:895] "Failed to get status for pod" podUID="d13d9c09-b7d7-426a-98ba-599bc728f39e" pod="openshift-must-gather-6dr26/must-gather-bwclz" err="pods \"must-gather-bwclz\" is forbidden: User \"system:node:ip-10-0-135-57.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-6dr26\": no relationship found between node 'ip-10-0-135-57.ec2.internal' and this object" Apr 23 17:37:38.584453 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.584416 2562 scope.go:117] "RemoveContainer" containerID="ad9e0897d02c3c05a7583e1d25155aac7dec55b4d24fdbdc118a3c4158f2c26d" Apr 23 17:37:38.584913 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:37:38.584877 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad9e0897d02c3c05a7583e1d25155aac7dec55b4d24fdbdc118a3c4158f2c26d\": container with ID starting 
with ad9e0897d02c3c05a7583e1d25155aac7dec55b4d24fdbdc118a3c4158f2c26d not found: ID does not exist" containerID="ad9e0897d02c3c05a7583e1d25155aac7dec55b4d24fdbdc118a3c4158f2c26d" Apr 23 17:37:38.585039 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.584931 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad9e0897d02c3c05a7583e1d25155aac7dec55b4d24fdbdc118a3c4158f2c26d"} err="failed to get container status \"ad9e0897d02c3c05a7583e1d25155aac7dec55b4d24fdbdc118a3c4158f2c26d\": rpc error: code = NotFound desc = could not find container \"ad9e0897d02c3c05a7583e1d25155aac7dec55b4d24fdbdc118a3c4158f2c26d\": container with ID starting with ad9e0897d02c3c05a7583e1d25155aac7dec55b4d24fdbdc118a3c4158f2c26d not found: ID does not exist" Apr 23 17:37:38.585039 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.584960 2562 scope.go:117] "RemoveContainer" containerID="660a2a5237efa218b2fd2bed7093b5733fa3b99994fc4e9cdf6ea894ee2f9644" Apr 23 17:37:38.585344 ip-10-0-135-57 kubenswrapper[2562]: E0423 17:37:38.585288 2562 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"660a2a5237efa218b2fd2bed7093b5733fa3b99994fc4e9cdf6ea894ee2f9644\": container with ID starting with 660a2a5237efa218b2fd2bed7093b5733fa3b99994fc4e9cdf6ea894ee2f9644 not found: ID does not exist" containerID="660a2a5237efa218b2fd2bed7093b5733fa3b99994fc4e9cdf6ea894ee2f9644" Apr 23 17:37:38.585444 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.585353 2562 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"660a2a5237efa218b2fd2bed7093b5733fa3b99994fc4e9cdf6ea894ee2f9644"} err="failed to get container status \"660a2a5237efa218b2fd2bed7093b5733fa3b99994fc4e9cdf6ea894ee2f9644\": rpc error: code = NotFound desc = could not find container \"660a2a5237efa218b2fd2bed7093b5733fa3b99994fc4e9cdf6ea894ee2f9644\": container with ID starting with 
660a2a5237efa218b2fd2bed7093b5733fa3b99994fc4e9cdf6ea894ee2f9644 not found: ID does not exist" Apr 23 17:37:38.745374 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:38.745272 2562 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d13d9c09-b7d7-426a-98ba-599bc728f39e" path="/var/lib/kubelet/pods/d13d9c09-b7d7-426a-98ba-599bc728f39e/volumes" Apr 23 17:37:40.422482 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:40.422447 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6c9b88dfdd-b9ljj_3b9aeaea-8c1b-40ec-b888-528be672bf52/metrics-server/0.log" Apr 23 17:37:40.448792 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:40.448764 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-wrchg_848ea94b-d7c7-4f3c-8f74-d35ab6f9770a/monitoring-plugin/0.log" Apr 23 17:37:40.481497 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:40.481473 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cxgtz_354460b2-9790-4c05-9a2a-dc4bab2fa675/node-exporter/0.log" Apr 23 17:37:40.508442 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:40.508416 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cxgtz_354460b2-9790-4c05-9a2a-dc4bab2fa675/kube-rbac-proxy/0.log" Apr 23 17:37:40.529822 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:40.529795 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cxgtz_354460b2-9790-4c05-9a2a-dc4bab2fa675/init-textfile/0.log" Apr 23 17:37:40.947788 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:40.947736 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_578680f3-31a4-4c7d-9df3-703f5b279c9a/prometheus/0.log" Apr 23 17:37:40.984609 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:40.984577 2562 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_578680f3-31a4-4c7d-9df3-703f5b279c9a/config-reloader/0.log" Apr 23 17:37:41.042380 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:41.042350 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_578680f3-31a4-4c7d-9df3-703f5b279c9a/thanos-sidecar/0.log" Apr 23 17:37:41.097475 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:41.097438 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_578680f3-31a4-4c7d-9df3-703f5b279c9a/kube-rbac-proxy-web/0.log" Apr 23 17:37:41.144756 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:41.144707 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_578680f3-31a4-4c7d-9df3-703f5b279c9a/kube-rbac-proxy/0.log" Apr 23 17:37:41.193784 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:41.193737 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_578680f3-31a4-4c7d-9df3-703f5b279c9a/kube-rbac-proxy-thanos/0.log" Apr 23 17:37:41.222579 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:41.222553 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_578680f3-31a4-4c7d-9df3-703f5b279c9a/init-config-reloader/0.log" Apr 23 17:37:41.262104 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:41.262006 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9qng7_46010ce7-8871-4f06-90d6-4933fe17216c/prometheus-operator/0.log" Apr 23 17:37:41.284650 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:41.284578 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-9qng7_46010ce7-8871-4f06-90d6-4933fe17216c/kube-rbac-proxy/0.log" Apr 23 17:37:41.349355 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:41.349327 2562 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_telemeter-client-6bfdbdfb67-mxpcf_acf14525-d550-4e69-b0a5-9d0557d765ad/telemeter-client/0.log" Apr 23 17:37:41.374108 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:41.374063 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6bfdbdfb67-mxpcf_acf14525-d550-4e69-b0a5-9d0557d765ad/reload/0.log" Apr 23 17:37:41.401860 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:41.401829 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6bfdbdfb67-mxpcf_acf14525-d550-4e69-b0a5-9d0557d765ad/kube-rbac-proxy/0.log" Apr 23 17:37:41.438236 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:41.438196 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbf84d895-rj6tl_d2eb8e62-9f07-4d19-9d67-f66771d26287/thanos-query/0.log" Apr 23 17:37:41.465128 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:41.465052 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbf84d895-rj6tl_d2eb8e62-9f07-4d19-9d67-f66771d26287/kube-rbac-proxy-web/0.log" Apr 23 17:37:41.499736 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:41.499707 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbf84d895-rj6tl_d2eb8e62-9f07-4d19-9d67-f66771d26287/kube-rbac-proxy/0.log" Apr 23 17:37:41.523088 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:41.523052 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbf84d895-rj6tl_d2eb8e62-9f07-4d19-9d67-f66771d26287/prom-label-proxy/0.log" Apr 23 17:37:41.545652 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:41.545602 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbf84d895-rj6tl_d2eb8e62-9f07-4d19-9d67-f66771d26287/kube-rbac-proxy-rules/0.log" Apr 23 17:37:41.568640 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:37:41.568601 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7cbf84d895-rj6tl_d2eb8e62-9f07-4d19-9d67-f66771d26287/kube-rbac-proxy-metrics/0.log" Apr 23 17:37:43.166763 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.165089 2562 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76"] Apr 23 17:37:43.166763 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.165837 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d13d9c09-b7d7-426a-98ba-599bc728f39e" containerName="gather" Apr 23 17:37:43.166763 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.165860 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13d9c09-b7d7-426a-98ba-599bc728f39e" containerName="gather" Apr 23 17:37:43.166763 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.165890 2562 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d13d9c09-b7d7-426a-98ba-599bc728f39e" containerName="copy" Apr 23 17:37:43.166763 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.165899 2562 state_mem.go:107] "Deleted CPUSet assignment" podUID="d13d9c09-b7d7-426a-98ba-599bc728f39e" containerName="copy" Apr 23 17:37:43.166763 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.166037 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="d13d9c09-b7d7-426a-98ba-599bc728f39e" containerName="copy" Apr 23 17:37:43.166763 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.166050 2562 memory_manager.go:356] "RemoveStaleState removing state" podUID="d13d9c09-b7d7-426a-98ba-599bc728f39e" containerName="gather" Apr 23 17:37:43.172307 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.172277 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.179825 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.178782 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76"] Apr 23 17:37:43.332520 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.332490 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6799bffdd6-d45jf_c0299958-18f0-43ec-a2a3-24e19c7d3c8d/console/0.log" Apr 23 17:37:43.337239 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.337215 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/06ada5c0-7d2b-4b09-ac60-a6bec97a8caf-podres\") pod \"perf-node-gather-daemonset-pcw76\" (UID: \"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.337367 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.337267 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/06ada5c0-7d2b-4b09-ac60-a6bec97a8caf-proc\") pod \"perf-node-gather-daemonset-pcw76\" (UID: \"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.337367 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.337323 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/06ada5c0-7d2b-4b09-ac60-a6bec97a8caf-lib-modules\") pod \"perf-node-gather-daemonset-pcw76\" (UID: \"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.337367 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.337354 2562 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06ada5c0-7d2b-4b09-ac60-a6bec97a8caf-sys\") pod \"perf-node-gather-daemonset-pcw76\" (UID: \"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.337495 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.337385 2562 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55wsq\" (UniqueName: \"kubernetes.io/projected/06ada5c0-7d2b-4b09-ac60-a6bec97a8caf-kube-api-access-55wsq\") pod \"perf-node-gather-daemonset-pcw76\" (UID: \"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.438435 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.438141 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/06ada5c0-7d2b-4b09-ac60-a6bec97a8caf-podres\") pod \"perf-node-gather-daemonset-pcw76\" (UID: \"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.438610 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.438455 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/06ada5c0-7d2b-4b09-ac60-a6bec97a8caf-proc\") pod \"perf-node-gather-daemonset-pcw76\" (UID: \"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.438610 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.438329 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/06ada5c0-7d2b-4b09-ac60-a6bec97a8caf-podres\") pod \"perf-node-gather-daemonset-pcw76\" (UID: \"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf\") " 
pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.438610 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.438551 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/06ada5c0-7d2b-4b09-ac60-a6bec97a8caf-lib-modules\") pod \"perf-node-gather-daemonset-pcw76\" (UID: \"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.438851 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.438616 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/06ada5c0-7d2b-4b09-ac60-a6bec97a8caf-proc\") pod \"perf-node-gather-daemonset-pcw76\" (UID: \"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.438851 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.438714 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/06ada5c0-7d2b-4b09-ac60-a6bec97a8caf-lib-modules\") pod \"perf-node-gather-daemonset-pcw76\" (UID: \"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.438851 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.438793 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06ada5c0-7d2b-4b09-ac60-a6bec97a8caf-sys\") pod \"perf-node-gather-daemonset-pcw76\" (UID: \"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.439014 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.438927 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/06ada5c0-7d2b-4b09-ac60-a6bec97a8caf-sys\") pod 
\"perf-node-gather-daemonset-pcw76\" (UID: \"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.439014 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.438952 2562 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55wsq\" (UniqueName: \"kubernetes.io/projected/06ada5c0-7d2b-4b09-ac60-a6bec97a8caf-kube-api-access-55wsq\") pod \"perf-node-gather-daemonset-pcw76\" (UID: \"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.448013 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.447987 2562 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55wsq\" (UniqueName: \"kubernetes.io/projected/06ada5c0-7d2b-4b09-ac60-a6bec97a8caf-kube-api-access-55wsq\") pod \"perf-node-gather-daemonset-pcw76\" (UID: \"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf\") " pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.488687 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.488656 2562 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:43.826629 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.826603 2562 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76"] Apr 23 17:37:43.833994 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:43.833964 2562 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 17:37:44.548193 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:44.548166 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wfdh5_eb5d7213-8672-4ab5-9189-ec203ba60c84/dns/0.log" Apr 23 17:37:44.569557 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:44.569516 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" event={"ID":"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf","Type":"ContainerStarted","Data":"3373b1d5df868a812a66cb390dd335fe38bec024f802b0f29c06bea5ebf332fa"} Apr 23 17:37:44.569557 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:44.569556 2562 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" event={"ID":"06ada5c0-7d2b-4b09-ac60-a6bec97a8caf","Type":"ContainerStarted","Data":"555dbba44aff0079199d2f510c69ec1bdfdc3dba518e3ba27d14a1d8d8887d23"} Apr 23 17:37:44.569833 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:44.569587 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:44.575171 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:44.575150 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wfdh5_eb5d7213-8672-4ab5-9189-ec203ba60c84/kube-rbac-proxy/0.log" Apr 23 17:37:44.587312 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:44.587274 2562 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" podStartSLOduration=1.58726111 podStartE2EDuration="1.58726111s" podCreationTimestamp="2026-04-23 17:37:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 17:37:44.585713876 +0000 UTC m=+3744.438528941" watchObservedRunningTime="2026-04-23 17:37:44.58726111 +0000 UTC m=+3744.440076173" Apr 23 17:37:44.599540 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:44.599512 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bfsmf_43aa3f2a-b871-42c1-a3ca-550b762203bf/dns-node-resolver/0.log" Apr 23 17:37:45.169800 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:45.169772 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-h68c5_e5a7d4ac-2ec3-48b6-8f5f-0d911a26eb86/node-ca/0.log" Apr 23 17:37:46.366544 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:46.366510 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-mh25k_aa24ec5c-a6a3-4c23-90a0-58ac28a5e1f9/serve-healthcheck-canary/0.log" Apr 23 17:37:46.770620 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:46.770590 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-56lmt_f9f9b357-558a-42d8-b068-7295ba867330/kube-rbac-proxy/0.log" Apr 23 17:37:46.790130 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:46.790105 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-56lmt_f9f9b357-558a-42d8-b068-7295ba867330/exporter/0.log" Apr 23 17:37:46.811285 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:46.811250 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-56lmt_f9f9b357-558a-42d8-b068-7295ba867330/extractor/0.log" Apr 23 
17:37:49.118876 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:49.118844 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-gjjcv_ceaa651d-2fe1-4eca-9d70-0f0ecb8e5433/server/0.log" Apr 23 17:37:49.513109 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:49.513081 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-qpp47_316b0480-e2ec-4d32-b14d-6bd0f8560aba/s3-init/0.log" Apr 23 17:37:49.544502 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:49.544469 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-js6s4_caa3c9cf-12d6-4cea-8453-20bcb03f5c31/s3-tls-init-custom/0.log" Apr 23 17:37:49.569109 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:49.569071 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-z7k4r_766278b4-babe-4f1b-9525-5589a3575cd9/s3-tls-init-serving/0.log" Apr 23 17:37:49.652355 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:49.652327 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-fg2d5_59e606bd-cff2-4904-bd7b-81a0fa5b15d5/seaweedfs-tls-serving/0.log" Apr 23 17:37:50.583495 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:50.583469 2562 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-lzvhv/perf-node-gather-daemonset-pcw76" Apr 23 17:37:55.583299 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:55.583274 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q6xdb_2226bf67-86e3-4375-a378-075aedce2ea6/kube-multus-additional-cni-plugins/0.log" Apr 23 17:37:55.605106 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:55.605080 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q6xdb_2226bf67-86e3-4375-a378-075aedce2ea6/egress-router-binary-copy/0.log" Apr 23 17:37:55.633034 ip-10-0-135-57 
kubenswrapper[2562]: I0423 17:37:55.633002 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q6xdb_2226bf67-86e3-4375-a378-075aedce2ea6/cni-plugins/0.log" Apr 23 17:37:55.659873 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:55.659841 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q6xdb_2226bf67-86e3-4375-a378-075aedce2ea6/bond-cni-plugin/0.log" Apr 23 17:37:55.680881 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:55.680839 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q6xdb_2226bf67-86e3-4375-a378-075aedce2ea6/routeoverride-cni/0.log" Apr 23 17:37:55.703212 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:55.703187 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q6xdb_2226bf67-86e3-4375-a378-075aedce2ea6/whereabouts-cni-bincopy/0.log" Apr 23 17:37:55.723796 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:55.723768 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-q6xdb_2226bf67-86e3-4375-a378-075aedce2ea6/whereabouts-cni/0.log" Apr 23 17:37:55.772458 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:55.772424 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h6w7v_27c0f8c4-6f27-4267-8d2e-7e39aa9adcef/kube-multus/0.log" Apr 23 17:37:55.854755 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:55.854662 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6wbcq_adbb31b5-ee6b-431b-ac95-7775688ba039/network-metrics-daemon/0.log" Apr 23 17:37:55.871355 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:55.871326 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-6wbcq_adbb31b5-ee6b-431b-ac95-7775688ba039/kube-rbac-proxy/0.log" 
Apr 23 17:37:57.563573 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:57.563534 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krt5k_fd41c55a-49c5-41cd-8fd5-f7964f5444e9/ovn-controller/0.log" Apr 23 17:37:57.675769 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:57.675722 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krt5k_fd41c55a-49c5-41cd-8fd5-f7964f5444e9/ovn-acl-logging/0.log" Apr 23 17:37:57.721253 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:57.721221 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krt5k_fd41c55a-49c5-41cd-8fd5-f7964f5444e9/kube-rbac-proxy-node/0.log" Apr 23 17:37:57.752083 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:57.752056 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krt5k_fd41c55a-49c5-41cd-8fd5-f7964f5444e9/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 17:37:57.777350 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:57.777296 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krt5k_fd41c55a-49c5-41cd-8fd5-f7964f5444e9/northd/0.log" Apr 23 17:37:57.801068 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:57.801043 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krt5k_fd41c55a-49c5-41cd-8fd5-f7964f5444e9/nbdb/0.log" Apr 23 17:37:57.824676 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:57.824603 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krt5k_fd41c55a-49c5-41cd-8fd5-f7964f5444e9/sbdb/0.log" Apr 23 17:37:58.046663 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:58.046630 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-krt5k_fd41c55a-49c5-41cd-8fd5-f7964f5444e9/ovnkube-controller/0.log" Apr 23 
17:37:59.127545 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:37:59.127512 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-wn6cd_0ee2ca50-34ff-4830-8c04-92018768a3a7/network-check-target-container/0.log" Apr 23 17:38:00.078239 ip-10-0-135-57 kubenswrapper[2562]: I0423 17:38:00.078159 2562 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-kjd4g_380d4b90-b7df-4855-ae70-0dc24d42f0d3/iptables-alerter/0.log"