Apr 21 10:03:51.532111 ip-10-0-133-157 systemd[1]: Starting Kubernetes Kubelet...
Apr 21 10:03:51.983611 ip-10-0-133-157 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:51.983611 ip-10-0-133-157 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 21 10:03:51.983611 ip-10-0-133-157 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 21 10:03:51.983611 ip-10-0-133-157 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 21 10:03:51.983611 ip-10-0-133-157 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
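The deprecation notices above all point at the kubelet config file named by --config. As a rough illustration only (field names from the upstream kubelet.config.k8s.io/v1beta1 KubeletConfiguration API; the values below are hypothetical, not read from this node), the flagged options map to config-file fields like this:

```yaml
# Sketch of a KubeletConfiguration fragment replacing the deprecated flags.
# Values are illustrative placeholders, not this node's actual settings.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint (config-file form uses a unix:// URL)
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction thresholds
evictionHard:
  memory.available: 100Mi
```

--pod-infra-container-image has no config-file replacement; per the message, the sandbox image is instead taken from the CRI runtime's own configuration.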
Apr 21 10:03:51.984764 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.984214 2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 21 10:03:51.992832 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.992805 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:51.992832 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.992827 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:51.992832 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.992831 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:51.993001 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.992984 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:51.993001 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.992991 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:51.993001 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.992996 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:51.993001 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.992999 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:51.993001 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993002 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993006 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993009 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993012 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993015 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993018 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993020 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993023 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993026 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993028 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993031 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993034 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993036 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993039 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993042 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993045 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993047 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993050 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993052 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:51.993125 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993055 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993057 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993061 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993064 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993067 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993069 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993072 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993074 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993077 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993079 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993082 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993084 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993087 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993090 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993092 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993096 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993098 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993105 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993110 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993113 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:51.993573 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993116 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993119 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993121 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993124 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993127 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993130 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993133 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993135 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993138 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993141 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993143 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993147 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993152 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993155 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993158 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993161 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993164 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993167 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993170 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:51.994090 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993172 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993175 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993180 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993182 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993185 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993188 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993190 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993193 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993197 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993199 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993202 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993205 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993208 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993210 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993213 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993215 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993218 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993221 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993223 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993226 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:51.994555 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993228 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993654 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993659 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993663 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993666 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993669 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993672 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993674 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993677 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993680 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993683 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993685 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993688 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993691 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993693 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993696 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993699 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993701 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993704 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993707 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:51.995060 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993711 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993714 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993717 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993719 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993722 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993726 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993728 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993731 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993733 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993736 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993738 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993741 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993757 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993760 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993763 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993765 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993768 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993771 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993781 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:51.995545 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993786 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993789 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993791 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993794 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993797 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993800 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993802 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993804 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993807 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993809 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993812 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993815 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993819 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993824 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993827 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993830 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993832 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993835 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993837 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:51.996037 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993841 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993844 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993846 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993849 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993851 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993854 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993856 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993859 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993861 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993864 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993866 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993868 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993872 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993876 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993879 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993882 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993884 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993887 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993890 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993892 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:51.996513 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993896 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993899 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993902 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993904 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993907 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993910 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993913 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993916 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.993919 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994721 2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994730 2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994741 2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994759 2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994766 2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994770 2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994775 2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994779 2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994782 2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994786 2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994789 2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994793 2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994796 2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 21 10:03:51.997014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994799 2577 flags.go:64] FLAG: --cgroup-root=""
Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994802 2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994805 2577 flags.go:64] FLAG: --client-ca-file=""
Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994808 2577 flags.go:64] FLAG: --cloud-config=""
Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994811 2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994814 2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994821 2577 flags.go:64] FLAG: --cluster-domain=""
Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994824 2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994827 2577 flags.go:64] FLAG: --config-dir=""
Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994830 2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994834 2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994838 2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994841 2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994844 2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994848 2577 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994851 2577 flags.go:64] FLAG: --contention-profiling="false" Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994854 2577 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994857 2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994860 2577 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994863 2577 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994870 2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994873 2577 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994876 2577 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994879 2577 flags.go:64] FLAG: --enable-load-reader="false" Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994883 2577 flags.go:64] FLAG: --enable-server="true" Apr 21 10:03:51.997599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994886 2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 21 10:03:51.998261 
ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994891 2577 flags.go:64] FLAG: --event-burst="100" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994895 2577 flags.go:64] FLAG: --event-qps="50" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994898 2577 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994901 2577 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994904 2577 flags.go:64] FLAG: --eviction-hard="" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994907 2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994910 2577 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994914 2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994917 2577 flags.go:64] FLAG: --eviction-soft="" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994921 2577 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994923 2577 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994926 2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994929 2577 flags.go:64] FLAG: --experimental-mounter-path="" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994933 2577 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994936 2577 flags.go:64] 
FLAG: --fail-swap-on="true" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994939 2577 flags.go:64] FLAG: --feature-gates="" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994947 2577 flags.go:64] FLAG: --file-check-frequency="20s" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994950 2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994953 2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994957 2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994960 2577 flags.go:64] FLAG: --healthz-port="10248" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994963 2577 flags.go:64] FLAG: --help="false" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994967 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-133-157.ec2.internal" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994970 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 21 10:03:51.998261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994973 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994976 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994979 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994983 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: 
I0421 10:03:51.994986 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994989 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994994 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.994998 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995001 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995004 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995007 2577 flags.go:64] FLAG: --kube-reserved="" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995010 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995013 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995016 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995019 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995021 2577 flags.go:64] FLAG: --lock-file="" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995024 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995027 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995030 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 21 
10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995036 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995039 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995042 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995045 2577 flags.go:64] FLAG: --logging-format="text" Apr 21 10:03:51.998880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995048 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995051 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995054 2577 flags.go:64] FLAG: --manifest-url="" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995056 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995061 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995064 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995068 2577 flags.go:64] FLAG: --max-pods="110" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995072 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995075 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995078 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:03:51.995081 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995084 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995087 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995090 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995098 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995102 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995105 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995109 2577 flags.go:64] FLAG: --pod-cidr="" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995113 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995118 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995121 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995124 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995127 2577 flags.go:64] FLAG: --port="10250" Apr 21 10:03:51.999457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995130 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 21 10:03:51.999457 ip-10-0-133-157 
kubenswrapper[2577]: I0421 10:03:51.995133 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0eee728e06a469b2c" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995136 2577 flags.go:64] FLAG: --qos-reserved="" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995139 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995142 2577 flags.go:64] FLAG: --register-node="true" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995145 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995148 2577 flags.go:64] FLAG: --register-with-taints="" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995152 2577 flags.go:64] FLAG: --registry-burst="10" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995154 2577 flags.go:64] FLAG: --registry-qps="5" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995157 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995160 2577 flags.go:64] FLAG: --reserved-memory="" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995164 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995167 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995170 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995172 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995179 2577 flags.go:64] FLAG: --runonce="false" Apr 21 10:03:52.000055 ip-10-0-133-157 
kubenswrapper[2577]: I0421 10:03:51.995182 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995185 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995188 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995191 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995195 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995198 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995201 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995204 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995207 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995210 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995213 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 21 10:03:52.000055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995218 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995221 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995224 2577 flags.go:64] FLAG: --system-cgroups="" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995227 2577 
flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995233 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995236 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995238 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995242 2577 flags.go:64] FLAG: --tls-min-version="" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995246 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995248 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995251 2577 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995254 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995257 2577 flags.go:64] FLAG: --v="2" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995261 2577 flags.go:64] FLAG: --version="false" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995265 2577 flags.go:64] FLAG: --vmodule="" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995270 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995273 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995383 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 21 
10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995387 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995390 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995395 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995399 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995402 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 21 10:03:52.000724 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995404 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995407 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995410 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995413 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995415 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995418 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995421 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995423 
2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995426 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995429 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995432 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995435 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995438 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995440 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995443 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995445 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995449 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995453 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995456 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995458 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 21 10:03:52.001315 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995461 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995464 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995466 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995469 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995473 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995476 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995479 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995482 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995485 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995490 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995493 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995495 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995498 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995501 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995503 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995506 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995508 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995511 2577 
feature_gate.go:328] unrecognized feature gate: NewOLM Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995514 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 21 10:03:52.001870 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995516 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995519 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995521 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995524 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995527 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995529 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995532 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995535 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995537 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995540 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995542 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 21 10:03:52.002355 ip-10-0-133-157 
kubenswrapper[2577]: W0421 10:03:51.995545 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995547 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995550 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995557 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995560 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995563 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995565 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995567 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995570 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 21 10:03:52.002355 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995572 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995575 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995579 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995581 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 21 10:03:52.002872 
ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995584 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995586 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995589 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995591 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995594 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995597 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995599 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995603 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995606 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995609 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995611 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995614 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995616 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995619 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995622 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995624 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:52.002872 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:51.995628 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:51.995639 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.002454 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.002575 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002628 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002633 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002636 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002640 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002643 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002646 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002648 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002651 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002654 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002657 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002660 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002662 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:52.003378 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002665 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002667 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002670 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002673 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002676 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002679 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002681 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002684 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002687 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002689 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002691 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002694 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002696 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002700 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002703 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002705 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002708 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002711 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002713 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:52.003805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002717 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002720 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002723 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002726 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002728 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002731 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002733 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002736 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002739 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002741 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002760 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002764 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002767 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002769 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002772 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002774 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002777 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002780 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002784 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:52.004318 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002788 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002791 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002794 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002797 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002800 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002802 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002805 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002808 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002811 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002814 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002817 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002820 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002822 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002826 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002829 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002832 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002834 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002837 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002840 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:52.004805 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002843 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002845 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002848 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002851 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002853 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002856 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002859 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002861 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002864 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002866 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002869 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002872 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002874 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002879 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002882 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002885 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:52.005279 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002888 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:52.005684 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.002894 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:52.005684 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.002995 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 21 10:03:52.005684 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003000 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 21 10:03:52.005684 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003003 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 21 10:03:52.005684 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003007 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 21 10:03:52.005684 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003012 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 21 10:03:52.005684 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003015 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 21 10:03:52.005684 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003018 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 21 10:03:52.005684 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003021 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 21 10:03:52.005684 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003024 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 21 10:03:52.005684 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003028 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 21 10:03:52.005684 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003030 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 21 10:03:52.005684 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003033 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 21 10:03:52.005684 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003036 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 21 10:03:52.005684 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003038 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003041 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003044 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003047 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003049 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003052 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003054 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003058 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003062 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003065 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003067 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003070 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003073 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003075 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003078 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003080 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003083 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003085 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003088 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003090 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 21 10:03:52.006083 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003093 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003095 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003098 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003100 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003107 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003109 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003111 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003115 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003118 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003121 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003124 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003126 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003129 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003131 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003134 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003136 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003139 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003141 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003144 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 21 10:03:52.006671 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003146 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003149 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003151 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003154 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003156 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003159 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003161 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003164 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003166 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003169 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003171 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003174 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003176 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003179 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003181 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003184 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003186 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003189 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003192 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003195 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 21 10:03:52.007160 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003197 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 21 10:03:52.007656 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003200 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 21 10:03:52.007656 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003203 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 21 10:03:52.007656 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003205 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 21 10:03:52.007656 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003208 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 21 10:03:52.007656 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003210 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 21 10:03:52.007656 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003213 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 21 10:03:52.007656 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003216 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 21 10:03:52.007656 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003218 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 21 10:03:52.007656 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003221 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 21 10:03:52.007656 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003223 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 21 10:03:52.007656 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003226 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 21 10:03:52.007656 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003229 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 21 10:03:52.007656 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:52.003231 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 21 10:03:52.007656 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.003236 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 21 10:03:52.007656 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.003922 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 21 10:03:52.008085 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.006168 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 21 10:03:52.008085 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.007160 2577 server.go:1019] "Starting client certificate rotation"
Apr 21 10:03:52.008085 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.007260 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 10:03:52.008085 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.007301 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 21 10:03:52.031627 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.031602 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 10:03:52.034802 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.034778 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 21 10:03:52.051424 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.051397 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 21 10:03:52.058071 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.058037 2577 log.go:25] "Validated CRI v1 image API"
Apr 21 10:03:52.059962 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.059939 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 21 10:03:52.065205 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.065174 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 21 10:03:52.067457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.067434 2577 fs.go:135] Filesystem UUIDs: map[29809a73-547f-44e4-935b-5a6facde363a:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 e9787231-44fd-4bf9-b24f-d467dea8c2ea:/dev/nvme0n1p4]
Apr 21 10:03:52.067517 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.067457 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 21 10:03:52.073055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.072938 2577 manager.go:217] Machine: {Timestamp:2026-04-21 10:03:52.071231951 +0000 UTC m=+0.409765959 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3112481 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22f55cedf86ea1545ac488927e772b SystemUUID:ec22f55c-edf8-6ea1-545a-c488927e772b BootID:049a2883-91c7-49af-a2cc-d952aeb8df58 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:11:f3:c1:1b:f3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:11:f3:c1:1b:f3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:02:71:ad:64:84:97 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 21 10:03:52.073055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.073043 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 21 10:03:52.073200 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.073187 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 21 10:03:52.073530 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.073504 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 21 10:03:52.073674 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.073531 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-157.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 21 10:03:52.073716 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.073684 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 21 10:03:52.073716 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.073693 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 21 10:03:52.073716 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.073707 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 10:03:52.075430 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.075418 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 21 10:03:52.077216 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.077204 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 10:03:52.077330 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.077321 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 21 10:03:52.079840 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.079829 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 21 10:03:52.079884 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.079845 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 21 10:03:52.079884 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.079860 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 21 10:03:52.079884 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.079869 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 21 10:03:52.079884 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.079880 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 21 10:03:52.080858 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.080846 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 10:03:52.080920 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.080865 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 21 10:03:52.083863 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.083846 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 21 10:03:52.085300 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.085281 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 21 10:03:52.087060 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.087045 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 21 10:03:52.087131 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.087067 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 21 10:03:52.087131 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.087073 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 21 10:03:52.087131 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.087078 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 21 10:03:52.087131 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.087084 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 21 10:03:52.087131 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.087095 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 21 10:03:52.087131 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.087101 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 21 10:03:52.087131 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.087106 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 21 10:03:52.087131 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.087113 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 21 10:03:52.087131 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.087119 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 21 10:03:52.087373 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.087141 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 21 10:03:52.087373 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.087150 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 21 10:03:52.088262 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.088252 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 21 10:03:52.088262 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.088262 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 21 10:03:52.091857 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.091835 2577 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-157.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 21 10:03:52.091934 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.091854 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 21 10:03:52.091934 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.091904 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-157.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 21 10:03:52.092374 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.092362 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 21 10:03:52.092404 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.092399 2577 server.go:1295] "Started kubelet"
Apr 21 10:03:52.092475 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.092450 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 21 10:03:52.092589 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.092550 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 21 10:03:52.092622 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.092608 2577 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 21 10:03:52.093388 ip-10-0-133-157 systemd[1]: Started Kubernetes Kubelet.
Apr 21 10:03:52.094386 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.094262 2577 server.go:317] "Adding debug handlers to kubelet server"
Apr 21 10:03:52.096567 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.096547 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 21 10:03:52.099698 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.098741 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-157.ec2.internal.18a8571c4eead599 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-157.ec2.internal,UID:ip-10-0-133-157.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-157.ec2.internal,},FirstTimestamp:2026-04-21 10:03:52.092374425 +0000 UTC m=+0.430908434,LastTimestamp:2026-04-21 10:03:52.092374425 +0000 UTC m=+0.430908434,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-157.ec2.internal,}"
Apr 21 10:03:52.100505 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.100488 2577 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 21 10:03:52.100846 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.100830 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 21 10:03:52.100905 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.100842 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 21 10:03:52.101599 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.101572 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-157.ec2.internal\" not found"
Apr 21 10:03:52.101675 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.101642 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 21 10:03:52.101675 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.101648 2577 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 21 10:03:52.101675 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.101672 2577 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 21 10:03:52.101846 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.101735 2577 factory.go:55] Registering systemd factory
Apr 21 10:03:52.101846 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.101771 2577 reconstruct.go:97] "Volume reconstruction finished"
Apr 21 10:03:52.101846 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.101789 2577 reconciler.go:26] "Reconciler: start to sync state"
Apr 21 10:03:52.101846 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.101789 2577 factory.go:223] Registration of the systemd container factory successfully
Apr 21 10:03:52.102178 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.102097 2577 factory.go:153] Registering CRI-O factory
Apr 21 10:03:52.102178 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.102114 2577 factory.go:223] Registration of the crio container factory successfully
Apr 21 10:03:52.102973 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.102951 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 21 10:03:52.103064 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.102991 2577 factory.go:103] Registering Raw factory
Apr 21 10:03:52.103064 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.103008 2577 manager.go:1196] Started watching for new ooms in manager
Apr 21 10:03:52.105139 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.105119 2577 manager.go:319] Starting recovery of all containers
Apr 21 10:03:52.105985 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.105946 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 21 10:03:52.106092 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.106035 2577 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-157.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 21 10:03:52.115924 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.115903 2577 manager.go:324] Recovery completed
Apr 21 10:03:52.120407 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.120392 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 10:03:52.123422 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.123390 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 10:03:52.123526 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.123434 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 10:03:52.123526 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.123449 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasSufficientPID"
Apr 21 10:03:52.124012 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.123993 2577 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 21 10:03:52.124012 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.124010 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 21 10:03:52.124127 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.124031 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 21 10:03:52.125524 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.125447 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-157.ec2.internal.18a8571c50c486ca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-157.ec2.internal,UID:ip-10-0-133-157.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-133-157.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-133-157.ec2.internal,},FirstTimestamp:2026-04-21 10:03:52.123418314 +0000 UTC m=+0.461952321,LastTimestamp:2026-04-21 10:03:52.123418314 +0000 UTC m=+0.461952321,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-157.ec2.internal,}"
Apr 21 10:03:52.126332 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.126317 2577 policy_none.go:49] "None policy: Start"
Apr 21 10:03:52.126332 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.126336 2577 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 21 10:03:52.126444 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.126347 2577 state_mem.go:35] "Initializing new in-memory state store"
Apr 21 10:03:52.139122 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.139047 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-157.ec2.internal.18a8571c50c4e060 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-157.ec2.internal,UID:ip-10-0-133-157.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-133-157.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-133-157.ec2.internal,},FirstTimestamp:2026-04-21 10:03:52.123441248 +0000 UTC m=+0.461975262,LastTimestamp:2026-04-21 10:03:52.123441248 +0000 UTC m=+0.461975262,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-157.ec2.internal,}"
Apr 21 10:03:52.149107 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.148998 2577 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-157.ec2.internal.18a8571c50c513fb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-157.ec2.internal,UID:ip-10-0-133-157.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-133-157.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-133-157.ec2.internal,},FirstTimestamp:2026-04-21 10:03:52.123454459 +0000 UTC m=+0.461988467,LastTimestamp:2026-04-21 10:03:52.123454459 +0000 UTC m=+0.461988467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-157.ec2.internal,}"
Apr 21 10:03:52.168457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.155314 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f47r8"
Apr 21 10:03:52.168457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.161602 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-f47r8"
Apr 21 10:03:52.168457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.167468 2577 manager.go:341] "Starting Device Plugin manager"
Apr 21 10:03:52.168457 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.167500 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 21 10:03:52.168457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.167510 2577 server.go:85] "Starting device plugin registration server"
Apr 21 10:03:52.168457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.167831 2577 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 21 10:03:52.168457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.167863 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 21 10:03:52.168457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.167963 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 21 10:03:52.168457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.168069 2577 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 21 10:03:52.168457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.168078 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 21 10:03:52.169105 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.169039 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 21 10:03:52.169105 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.169077 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-157.ec2.internal\" not found"
Apr 21 10:03:52.225832 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.225793 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 21 10:03:52.227064 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.227048 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 21 10:03:52.227134 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.227077 2577 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 21 10:03:52.227134 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.227096 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 21 10:03:52.227134 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.227103 2577 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 21 10:03:52.227271 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.227186 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 21 10:03:52.229575 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.229548 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 21 10:03:52.268665 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.268582 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 10:03:52.272330 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.272309 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 10:03:52.272465 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.272343 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 10:03:52.272465 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.272353 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasSufficientPID"
Apr 21 10:03:52.272465 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.272387 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.280688 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.280667 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.280809 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.280699 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-157.ec2.internal\": node \"ip-10-0-133-157.ec2.internal\" not found"
Apr 21 10:03:52.297088 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.297053 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-157.ec2.internal\" not found"
Apr 21 10:03:52.327643 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.327613 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-157.ec2.internal"]
Apr 21 10:03:52.327725 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.327716 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 10:03:52.330185 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.330169 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 10:03:52.330264 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.330199 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 10:03:52.330264 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.330209 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasSufficientPID"
Apr 21 10:03:52.331476 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.331463 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 10:03:52.332300 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.332278 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.332396 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.332308 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 10:03:52.332396 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.332316 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 10:03:52.332396 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.332331 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 10:03:52.332396 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.332343 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasSufficientPID"
Apr 21 10:03:52.333414 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.333397 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 10:03:52.333523 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.333429 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 10:03:52.333523 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.333442 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasSufficientPID"
Apr 21 10:03:52.333956 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.333939 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.334043 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.333981 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 21 10:03:52.334708 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.334689 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasSufficientMemory"
Apr 21 10:03:52.334811 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.334715 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasNoDiskPressure"
Apr 21 10:03:52.334811 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.334725 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeHasSufficientPID"
Apr 21 10:03:52.350429 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.350403 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-157.ec2.internal\" not found" node="ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.354970 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.354949 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-157.ec2.internal\" not found" node="ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.397761 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.397711 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-157.ec2.internal\" not found"
Apr 21 10:03:52.404126 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.404088 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0dca983b0ab93ffb2f6b3f066e9c69dd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal\" (UID: \"0dca983b0ab93ffb2f6b3f066e9c69dd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.404243 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.404140 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0dca983b0ab93ffb2f6b3f066e9c69dd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal\" (UID: \"0dca983b0ab93ffb2f6b3f066e9c69dd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.404243 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.404169 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6c2c9c59cdaa3a4ce6126af55beb4c88-config\") pod \"kube-apiserver-proxy-ip-10-0-133-157.ec2.internal\" (UID: \"6c2c9c59cdaa3a4ce6126af55beb4c88\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.498673 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.498633 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-157.ec2.internal\" not found"
Apr 21 10:03:52.505015 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.504995 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0dca983b0ab93ffb2f6b3f066e9c69dd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal\" (UID: \"0dca983b0ab93ffb2f6b3f066e9c69dd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.505103 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.505024 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0dca983b0ab93ffb2f6b3f066e9c69dd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal\" (UID: \"0dca983b0ab93ffb2f6b3f066e9c69dd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.505103 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.505045 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6c2c9c59cdaa3a4ce6126af55beb4c88-config\") pod \"kube-apiserver-proxy-ip-10-0-133-157.ec2.internal\" (UID: \"6c2c9c59cdaa3a4ce6126af55beb4c88\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.505103 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.505068 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0dca983b0ab93ffb2f6b3f066e9c69dd-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal\" (UID: \"0dca983b0ab93ffb2f6b3f066e9c69dd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.505203 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.505099 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0dca983b0ab93ffb2f6b3f066e9c69dd-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal\" (UID: \"0dca983b0ab93ffb2f6b3f066e9c69dd\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.505203 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.505080 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6c2c9c59cdaa3a4ce6126af55beb4c88-config\") pod \"kube-apiserver-proxy-ip-10-0-133-157.ec2.internal\" (UID: \"6c2c9c59cdaa3a4ce6126af55beb4c88\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.598986 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.598895 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-157.ec2.internal\" not found"
Apr 21 10:03:52.653360 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.653331 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.657897 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:52.657875 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-157.ec2.internal"
Apr 21 10:03:52.699608 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.699571 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-157.ec2.internal\" not found"
Apr 21 10:03:52.800090 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.800053 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-157.ec2.internal\" not found"
Apr 21 10:03:52.900658 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:52.900565 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-157.ec2.internal\" not found"
Apr 21 10:03:53.001063 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:53.001024 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-157.ec2.internal\" not found"
Apr 21 10:03:53.007340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.007316 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 21 10:03:53.007473 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.007454 2577 reflector.go:556] "Warning: watch ended with error"
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 21 10:03:53.101413 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:53.101237 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-157.ec2.internal\" not found" Apr 21 10:03:53.101413 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.101267 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 21 10:03:53.112766 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.112728 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 21 10:03:53.137677 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.137646 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-tjn44" Apr 21 10:03:53.143435 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.143414 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-tjn44" Apr 21 10:03:53.152485 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:53.152329 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0dca983b0ab93ffb2f6b3f066e9c69dd.slice/crio-09bbacf6593b5630466ad757d0c2b3e14f83c2bedf0c4834f2a61c1e61437b60 WatchSource:0}: Error finding container 09bbacf6593b5630466ad757d0c2b3e14f83c2bedf0c4834f2a61c1e61437b60: Status 404 returned error can't find the container with id 09bbacf6593b5630466ad757d0c2b3e14f83c2bedf0c4834f2a61c1e61437b60 Apr 21 10:03:53.152679 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:53.152657 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c2c9c59cdaa3a4ce6126af55beb4c88.slice/crio-df5e89ae0f465c65643ae23bb49b7bdaa48d0d386c62292c88bf5dba9163c933 WatchSource:0}: Error finding container df5e89ae0f465c65643ae23bb49b7bdaa48d0d386c62292c88bf5dba9163c933: Status 404 returned error can't find the container with id df5e89ae0f465c65643ae23bb49b7bdaa48d0d386c62292c88bf5dba9163c933 Apr 21 10:03:53.157265 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.157242 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:03:53.163895 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.163859 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-20 09:58:52 +0000 UTC" deadline="2027-10-29 22:07:15.483712684 +0000 UTC" Apr 21 10:03:53.163895 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.163893 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13356h3m22.319822841s" Apr 21 10:03:53.201511 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:53.201476 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-157.ec2.internal\" not found" Apr 21 10:03:53.230852 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.230804 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-157.ec2.internal" event={"ID":"6c2c9c59cdaa3a4ce6126af55beb4c88","Type":"ContainerStarted","Data":"df5e89ae0f465c65643ae23bb49b7bdaa48d0d386c62292c88bf5dba9163c933"} Apr 21 10:03:53.231701 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.231674 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal" 
event={"ID":"0dca983b0ab93ffb2f6b3f066e9c69dd","Type":"ContainerStarted","Data":"09bbacf6593b5630466ad757d0c2b3e14f83c2bedf0c4834f2a61c1e61437b60"} Apr 21 10:03:53.257126 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.257103 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:53.302010 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:53.301977 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-157.ec2.internal\" not found" Apr 21 10:03:53.402557 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:53.402474 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-157.ec2.internal\" not found" Apr 21 10:03:53.495385 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.495350 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:53.501349 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.501317 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal" Apr 21 10:03:53.513260 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.513229 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 10:03:53.514176 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.514153 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-157.ec2.internal" Apr 21 10:03:53.522117 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:53.522092 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 21 10:03:53.560938 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:03:53.560906 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:54.081218 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.081181 2577 apiserver.go:52] "Watching apiserver" Apr 21 10:03:54.090066 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.090032 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 21 10:03:54.090428 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.090397 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal","openshift-multus/multus-additional-cni-plugins-g77pc","openshift-multus/multus-xtxxg","openshift-ovn-kubernetes/ovnkube-node-hc44q","kube-system/kube-apiserver-proxy-ip-10-0-133-157.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td","openshift-cluster-node-tuning-operator/tuned-h7tlg","openshift-dns/node-resolver-c588q","openshift-image-registry/node-ca-4ggkj","openshift-multus/network-metrics-daemon-kckvj","openshift-network-diagnostics/network-check-target-qxvf7","openshift-network-operator/iptables-alerter-gwnnd","kube-system/konnectivity-agent-8c424"] Apr 21 10:03:54.092518 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.092498 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.093667 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.093640 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.095064 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.094845 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.095064 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.094923 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 21 10:03:54.095064 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.094959 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ghbqc\"" Apr 21 10:03:54.095064 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.094924 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 21 10:03:54.096052 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.096032 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-57929\"" Apr 21 10:03:54.096154 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.096062 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 21 10:03:54.098775 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.096712 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 21 10:03:54.098775 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.096733 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 21 10:03:54.098775 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.097457 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 21 10:03:54.098775 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.097763 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 21 10:03:54.098775 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.097796 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-whmjl\"" Apr 21 10:03:54.098775 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.098253 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 21 10:03:54.100231 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.100210 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.101424 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.101399 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.101585 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.101569 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-c588q" Apr 21 10:03:54.102450 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.102421 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 21 10:03:54.102863 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.102694 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 21 10:03:54.102863 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.102702 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 21 10:03:54.102984 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.102964 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 21 10:03:54.103090 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.103027 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 21 10:03:54.103178 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.103158 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 21 10:03:54.103257 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.103175 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-5wfdc\"" Apr 21 10:03:54.103403 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.103383 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 21 10:03:54.103601 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.103585 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 21 10:03:54.103841 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.103823 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qh762\"" Apr 21 10:03:54.104143 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.104123 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 21 10:03:54.104231 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.104160 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 21 10:03:54.104686 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.104443 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4ggkj" Apr 21 10:03:54.104686 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.104536 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:03:54.104686 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:54.104604 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe" Apr 21 10:03:54.104942 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.104916 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8q6xb\"" Apr 21 10:03:54.105019 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.104998 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 21 10:03:54.105893 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.105875 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7" Apr 21 10:03:54.105991 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:54.105966 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082" Apr 21 10:03:54.107083 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.107062 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 21 10:03:54.107177 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.107136 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-gwnnd" Apr 21 10:03:54.107626 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.107608 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-vj7ql\"" Apr 21 10:03:54.107626 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.107622 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 21 10:03:54.108252 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.108226 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 21 10:03:54.108354 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.108339 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8c424" Apr 21 10:03:54.111257 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.111241 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 21 10:03:54.111352 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.111326 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 21 10:03:54.111570 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.111552 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 21 10:03:54.111684 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.111655 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8b4v4\"" Apr 21 10:03:54.111784 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.111735 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 21 10:03:54.111949 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.111921 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9xwmk\"" Apr 21 10:03:54.112262 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.112153 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 21 10:03:54.114068 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114034 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-sysctl-conf\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.114159 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114087 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.114159 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114116 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-etc-selinux\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.114159 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114152 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-host\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.114291 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114188 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-cni-binary-copy\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.114291 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114216 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-kubelet\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.114291 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-ovn-node-metrics-cert\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.114291 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114262 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-os-release\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.114455 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:03:54.114289 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-multus-socket-dir-parent\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.114455 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114312 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-slash\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.114455 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114374 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-var-lib-kubelet\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.114566 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114458 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-os-release\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.114566 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114486 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jxr9\" (UniqueName: \"kubernetes.io/projected/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-kube-api-access-8jxr9\") pod \"multus-additional-cni-plugins-g77pc\" (UID: 
\"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.114566 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114506 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-run-openvswitch\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.114566 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114525 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-socket-dir\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.114566 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114550 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-systemd\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.114788 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114573 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2p8n\" (UniqueName: \"kubernetes.io/projected/451f825c-7185-464f-967b-97007b1437b8-kube-api-access-m2p8n\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.114788 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114612 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-cnibin\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.114788 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114669 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.114788 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114708 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-registration-dir\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.114788 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114767 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w569r\" (UniqueName: \"kubernetes.io/projected/a327162a-7ff0-4ea9-9be0-15ce746f80a2-kube-api-access-w569r\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.114959 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114793 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-run-ovn\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 
21 10:03:54.114959 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114817 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-sys-fs\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.114959 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114839 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-run-netns\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.114959 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114871 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a327162a-7ff0-4ea9-9be0-15ce746f80a2-multus-daemon-config\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.114959 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114894 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrkks\" (UniqueName: \"kubernetes.io/projected/b4e66db4-f8f5-415c-aac2-60c02dfc43ff-kube-api-access-qrkks\") pod \"node-ca-4ggkj\" (UID: \"b4e66db4-f8f5-415c-aac2-60c02dfc43ff\") " pod="openshift-image-registry/node-ca-4ggkj" Apr 21 10:03:54.114959 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114916 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-run\") pod \"tuned-h7tlg\" (UID: 
\"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.114959 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114952 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-lib-modules\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.115237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.114990 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.115237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115019 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-run-netns\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.115237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115045 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-var-lib-openvswitch\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.115237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115088 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-var-lib-kubelet\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.115237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115114 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4e66db4-f8f5-415c-aac2-60c02dfc43ff-host\") pod \"node-ca-4ggkj\" (UID: \"b4e66db4-f8f5-415c-aac2-60c02dfc43ff\") " pod="openshift-image-registry/node-ca-4ggkj" Apr 21 10:03:54.115237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115154 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/451f825c-7185-464f-967b-97007b1437b8-etc-tuned\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.115237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115174 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-etc-openvswitch\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.115237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115194 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-log-socket\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.115237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115208 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-env-overrides\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.115237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115223 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-ovnkube-script-lib\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.115649 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115257 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-multus-cni-dir\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.115649 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115300 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-etc-kubernetes\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.115649 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115324 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-sysconfig\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.115649 ip-10-0-133-157 
kubenswrapper[2577]: I0421 10:03:54.115346 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-kubernetes\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.115649 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-cni-bin\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.115649 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115404 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a327162a-7ff0-4ea9-9be0-15ce746f80a2-cni-binary-copy\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.115649 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115425 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-cnibin\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.115649 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115469 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-hostroot\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 
10:03:54.115649 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115486 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-multus-conf-dir\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.115649 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115501 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/186c1594-0ba9-495b-8213-27692e681b57-hosts-file\") pod \"node-resolver-c588q\" (UID: \"186c1594-0ba9-495b-8213-27692e681b57\") " pod="openshift-dns/node-resolver-c588q" Apr 21 10:03:54.115649 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115522 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzhkg\" (UniqueName: \"kubernetes.io/projected/186c1594-0ba9-495b-8213-27692e681b57-kube-api-access-vzhkg\") pod \"node-resolver-c588q\" (UID: \"186c1594-0ba9-495b-8213-27692e681b57\") " pod="openshift-dns/node-resolver-c588q" Apr 21 10:03:54.115649 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115546 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-system-cni-dir\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.115649 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115599 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-run-ovn-kubernetes\") pod \"ovnkube-node-hc44q\" (UID: 
\"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.115649 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115646 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b4e66db4-f8f5-415c-aac2-60c02dfc43ff-serviceca\") pod \"node-ca-4ggkj\" (UID: \"b4e66db4-f8f5-415c-aac2-60c02dfc43ff\") " pod="openshift-image-registry/node-ca-4ggkj" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115671 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-cni-netd\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115696 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjwz4\" (UniqueName: \"kubernetes.io/projected/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-kube-api-access-kjwz4\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115720 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-device-dir\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115759 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-run-k8s-cni-cncf-io\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115785 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/186c1594-0ba9-495b-8213-27692e681b57-tmp-dir\") pod \"node-resolver-c588q\" (UID: \"186c1594-0ba9-495b-8213-27692e681b57\") " pod="openshift-dns/node-resolver-c588q" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115808 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-run-systemd\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115831 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-ovnkube-config\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115854 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:03:54.115877 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gctv\" (UniqueName: \"kubernetes.io/projected/7090d2d9-5242-41cd-9157-a84d2b1a535d-kube-api-access-2gctv\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115900 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-sysctl-d\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115925 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-system-cni-dir\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115951 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.115991 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-var-lib-cni-bin\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.116015 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-run-multus-certs\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.116039 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-sys\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.116301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.116083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/451f825c-7185-464f-967b-97007b1437b8-tmp\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.116955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.116106 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-modprobe-d\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.116955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.116190 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-systemd-units\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.116955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.116215 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-node-log\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.116955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.116239 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-var-lib-cni-multus\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.144174 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.144123 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 09:58:53 +0000 UTC" deadline="2028-01-13 03:17:13.95102483 +0000 UTC" Apr 21 10:03:54.144174 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.144172 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15161h13m19.806858776s" Apr 21 10:03:54.203022 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.202990 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 21 10:03:54.217351 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217315 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-systemd\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.217351 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2p8n\" (UniqueName: \"kubernetes.io/projected/451f825c-7185-464f-967b-97007b1437b8-kube-api-access-m2p8n\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.217571 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217376 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-cnibin\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.217571 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217398 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.217571 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217425 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-registration-dir\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.217571 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217447 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w569r\" (UniqueName: \"kubernetes.io/projected/a327162a-7ff0-4ea9-9be0-15ce746f80a2-kube-api-access-w569r\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.217571 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217449 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-systemd\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.217571 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217472 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-run-ovn\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.217571 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217466 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-cnibin\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.217571 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217524 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-registration-dir\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.217571 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:03:54.217528 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-run-ovn\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217620 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-sys-fs\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217648 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-run-netns\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217667 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a327162a-7ff0-4ea9-9be0-15ce746f80a2-multus-daemon-config\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217689 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrkks\" (UniqueName: \"kubernetes.io/projected/b4e66db4-f8f5-415c-aac2-60c02dfc43ff-kube-api-access-qrkks\") pod \"node-ca-4ggkj\" (UID: \"b4e66db4-f8f5-415c-aac2-60c02dfc43ff\") " pod="openshift-image-registry/node-ca-4ggkj" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217707 
2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-run\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217713 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-sys-fs\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217729 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-lib-modules\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217760 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-run-netns\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217770 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217788 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-run\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-run-netns\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-var-lib-openvswitch\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217876 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-var-lib-kubelet\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217905 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4e66db4-f8f5-415c-aac2-60c02dfc43ff-host\") pod \"node-ca-4ggkj\" (UID: \"b4e66db4-f8f5-415c-aac2-60c02dfc43ff\") " pod="openshift-image-registry/node-ca-4ggkj" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217908 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217892 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-lib-modules\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217924 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-run-netns\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.217955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217954 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b4e66db4-f8f5-415c-aac2-60c02dfc43ff-host\") pod \"node-ca-4ggkj\" (UID: \"b4e66db4-f8f5-415c-aac2-60c02dfc43ff\") " pod="openshift-image-registry/node-ca-4ggkj" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217946 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-var-lib-openvswitch\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217958 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-var-lib-kubelet\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.217977 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/451f825c-7185-464f-967b-97007b1437b8-etc-tuned\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218008 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-etc-openvswitch\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218086 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218123 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-log-socket\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218149 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-env-overrides\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218154 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-etc-openvswitch\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-ovnkube-script-lib\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218207 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-multus-cni-dir\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218222 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-log-socket\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218237 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-etc-kubernetes\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-sysconfig\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218290 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-kubernetes\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218289 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-multus-cni-dir\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218287 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218330 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-etc-kubernetes\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.218781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218336 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-sysconfig\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218337 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a327162a-7ff0-4ea9-9be0-15ce746f80a2-multus-daemon-config\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218361 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-cni-bin\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218389 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a327162a-7ff0-4ea9-9be0-15ce746f80a2-cni-binary-copy\") pod \"multus-xtxxg\" (UID: 
\"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218414 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-cnibin\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-cni-bin\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218450 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c6bbb2a8-d1d8-46e7-a943-36618a64adb4-iptables-alerter-script\") pod \"iptables-alerter-gwnnd\" (UID: \"c6bbb2a8-d1d8-46e7-a943-36618a64adb4\") " pod="openshift-network-operator/iptables-alerter-gwnnd" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218461 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-cnibin\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218478 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b7b9a85e-3e84-4e88-b193-a2eff9d45b6a-agent-certs\") pod \"konnectivity-agent-8c424\" (UID: 
\"b7b9a85e-3e84-4e88-b193-a2eff9d45b6a\") " pod="kube-system/konnectivity-agent-8c424" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218389 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-kubernetes\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218513 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs\") pod \"network-metrics-daemon-kckvj\" (UID: \"e437b5da-7e75-4ed5-8d79-e418168b80fe\") " pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218550 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-hostroot\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218575 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-multus-conf-dir\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218600 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/186c1594-0ba9-495b-8213-27692e681b57-hosts-file\") pod \"node-resolver-c588q\" (UID: 
\"186c1594-0ba9-495b-8213-27692e681b57\") " pod="openshift-dns/node-resolver-c588q" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218637 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzhkg\" (UniqueName: \"kubernetes.io/projected/186c1594-0ba9-495b-8213-27692e681b57-kube-api-access-vzhkg\") pod \"node-resolver-c588q\" (UID: \"186c1594-0ba9-495b-8213-27692e681b57\") " pod="openshift-dns/node-resolver-c588q" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218656 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-hostroot\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218663 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6bbb2a8-d1d8-46e7-a943-36618a64adb4-host-slash\") pod \"iptables-alerter-gwnnd\" (UID: \"c6bbb2a8-d1d8-46e7-a943-36618a64adb4\") " pod="openshift-network-operator/iptables-alerter-gwnnd" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218673 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-env-overrides\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.219596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218702 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/186c1594-0ba9-495b-8213-27692e681b57-hosts-file\") pod \"node-resolver-c588q\" (UID: 
\"186c1594-0ba9-495b-8213-27692e681b57\") " pod="openshift-dns/node-resolver-c588q" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218732 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-system-cni-dir\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218765 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-run-ovn-kubernetes\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218774 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-multus-conf-dir\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218785 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-ovnkube-script-lib\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218805 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b4e66db4-f8f5-415c-aac2-60c02dfc43ff-serviceca\") pod \"node-ca-4ggkj\" (UID: 
\"b4e66db4-f8f5-415c-aac2-60c02dfc43ff\") " pod="openshift-image-registry/node-ca-4ggkj" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218843 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-system-cni-dir\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-cni-netd\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218877 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-run-ovn-kubernetes\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kjwz4\" (UniqueName: \"kubernetes.io/projected/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-kube-api-access-kjwz4\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218947 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-device-dir\") pod 
\"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218956 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a327162a-7ff0-4ea9-9be0-15ce746f80a2-cni-binary-copy\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218997 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-device-dir\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.218999 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-run-k8s-cni-cncf-io\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219029 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-run-k8s-cni-cncf-io\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-cni-netd\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/186c1594-0ba9-495b-8213-27692e681b57-tmp-dir\") pod \"node-resolver-c588q\" (UID: \"186c1594-0ba9-495b-8213-27692e681b57\") " pod="openshift-dns/node-resolver-c588q" Apr 21 10:03:54.220340 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219076 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxdxm\" (UniqueName: \"kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm\") pod \"network-check-target-qxvf7\" (UID: \"8723d4f5-441e-4586-b642-f008d599b082\") " pod="openshift-network-diagnostics/network-check-target-qxvf7" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219103 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b7b9a85e-3e84-4e88-b193-a2eff9d45b6a-konnectivity-ca\") pod \"konnectivity-agent-8c424\" (UID: \"b7b9a85e-3e84-4e88-b193-a2eff9d45b6a\") " pod="kube-system/konnectivity-agent-8c424" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219138 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-run-systemd\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219163 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-ovnkube-config\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219222 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gctv\" (UniqueName: \"kubernetes.io/projected/7090d2d9-5242-41cd-9157-a84d2b1a535d-kube-api-access-2gctv\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219248 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-sysctl-d\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219266 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.221100 
ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219279 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-system-cni-dir\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219305 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219323 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/186c1594-0ba9-495b-8213-27692e681b57-tmp-dir\") pod \"node-resolver-c588q\" (UID: \"186c1594-0ba9-495b-8213-27692e681b57\") " pod="openshift-dns/node-resolver-c588q" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219332 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-var-lib-cni-bin\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219349 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b4e66db4-f8f5-415c-aac2-60c02dfc43ff-serviceca\") pod \"node-ca-4ggkj\" (UID: \"b4e66db4-f8f5-415c-aac2-60c02dfc43ff\") " 
pod="openshift-image-registry/node-ca-4ggkj" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219192 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-run-systemd\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219360 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-run-multus-certs\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219386 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v5fc\" (UniqueName: \"kubernetes.io/projected/c6bbb2a8-d1d8-46e7-a943-36618a64adb4-kube-api-access-8v5fc\") pod \"iptables-alerter-gwnnd\" (UID: \"c6bbb2a8-d1d8-46e7-a943-36618a64adb4\") " pod="openshift-network-operator/iptables-alerter-gwnnd" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-system-cni-dir\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.221100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219413 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-sys\") pod \"tuned-h7tlg\" (UID: 
\"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/451f825c-7185-464f-967b-97007b1437b8-tmp\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219463 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-sys\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219479 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-sysctl-d\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219481 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhnzq\" (UniqueName: \"kubernetes.io/projected/e437b5da-7e75-4ed5-8d79-e418168b80fe-kube-api-access-qhnzq\") pod \"network-metrics-daemon-kckvj\" (UID: \"e437b5da-7e75-4ed5-8d79-e418168b80fe\") " pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-var-lib-cni-bin\") pod \"multus-xtxxg\" 
(UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219519 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-modprobe-d\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219548 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219508 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-run-multus-certs\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219575 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-systemd-units\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219629 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-node-log\") pod 
\"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219631 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-ovnkube-config\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219656 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-var-lib-cni-multus\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219647 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-modprobe-d\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219686 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-systemd-units\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219695 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-host-var-lib-cni-multus\") pod 
\"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219728 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-sysctl-conf\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.221882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219729 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-node-log\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219789 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-etc-selinux\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219847 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-etc-sysctl-conf\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219893 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-host\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219903 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-etc-selinux\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-host\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-cni-binary-copy\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219966 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-kubelet\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.219990 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-ovn-node-metrics-cert\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-os-release\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220044 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-multus-socket-dir-parent\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220072 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-slash\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220100 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-var-lib-kubelet\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220125 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-os-release\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jxr9\" (UniqueName: \"kubernetes.io/projected/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-kube-api-access-8jxr9\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220181 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-run-openvswitch\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.222480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-socket-dir\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.223038 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220224 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.223038 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220288 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-slash\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.223038 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220303 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/451f825c-7185-464f-967b-97007b1437b8-var-lib-kubelet\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.223038 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220361 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-os-release\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.223038 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220421 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-host-kubelet\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.223038 ip-10-0-133-157 
kubenswrapper[2577]: I0421 10:03:54.220424 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-cni-binary-copy\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.223038 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220491 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-run-openvswitch\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.223038 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220471 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7090d2d9-5242-41cd-9157-a84d2b1a535d-socket-dir\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.223038 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220557 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-os-release\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.223038 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.220564 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a327162a-7ff0-4ea9-9be0-15ce746f80a2-multus-socket-dir-parent\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.223038 ip-10-0-133-157 
kubenswrapper[2577]: I0421 10:03:54.222115 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/451f825c-7185-464f-967b-97007b1437b8-etc-tuned\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.223038 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.222171 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/451f825c-7185-464f-967b-97007b1437b8-tmp\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.223038 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.222807 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-ovn-node-metrics-cert\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.227853 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.227784 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrkks\" (UniqueName: \"kubernetes.io/projected/b4e66db4-f8f5-415c-aac2-60c02dfc43ff-kube-api-access-qrkks\") pod \"node-ca-4ggkj\" (UID: \"b4e66db4-f8f5-415c-aac2-60c02dfc43ff\") " pod="openshift-image-registry/node-ca-4ggkj" Apr 21 10:03:54.228607 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.228584 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzhkg\" (UniqueName: \"kubernetes.io/projected/186c1594-0ba9-495b-8213-27692e681b57-kube-api-access-vzhkg\") pod \"node-resolver-c588q\" (UID: \"186c1594-0ba9-495b-8213-27692e681b57\") " pod="openshift-dns/node-resolver-c588q" Apr 21 10:03:54.228731 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:03:54.228681 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w569r\" (UniqueName: \"kubernetes.io/projected/a327162a-7ff0-4ea9-9be0-15ce746f80a2-kube-api-access-w569r\") pod \"multus-xtxxg\" (UID: \"a327162a-7ff0-4ea9-9be0-15ce746f80a2\") " pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.229310 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.229255 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjwz4\" (UniqueName: \"kubernetes.io/projected/7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f-kube-api-access-kjwz4\") pod \"ovnkube-node-hc44q\" (UID: \"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f\") " pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.233800 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.233741 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jxr9\" (UniqueName: \"kubernetes.io/projected/1e8275fc-da1c-442f-8dd4-cc1ad0f529fe-kube-api-access-8jxr9\") pod \"multus-additional-cni-plugins-g77pc\" (UID: \"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe\") " pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.235039 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.235011 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2p8n\" (UniqueName: \"kubernetes.io/projected/451f825c-7185-464f-967b-97007b1437b8-kube-api-access-m2p8n\") pod \"tuned-h7tlg\" (UID: \"451f825c-7185-464f-967b-97007b1437b8\") " pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.235178 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.235164 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gctv\" (UniqueName: \"kubernetes.io/projected/7090d2d9-5242-41cd-9157-a84d2b1a535d-kube-api-access-2gctv\") pod \"aws-ebs-csi-driver-node-tz8td\" (UID: \"7090d2d9-5242-41cd-9157-a84d2b1a535d\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.320534 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.320498 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c6bbb2a8-d1d8-46e7-a943-36618a64adb4-iptables-alerter-script\") pod \"iptables-alerter-gwnnd\" (UID: \"c6bbb2a8-d1d8-46e7-a943-36618a64adb4\") " pod="openshift-network-operator/iptables-alerter-gwnnd" Apr 21 10:03:54.320719 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.320537 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b7b9a85e-3e84-4e88-b193-a2eff9d45b6a-agent-certs\") pod \"konnectivity-agent-8c424\" (UID: \"b7b9a85e-3e84-4e88-b193-a2eff9d45b6a\") " pod="kube-system/konnectivity-agent-8c424" Apr 21 10:03:54.320719 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.320587 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs\") pod \"network-metrics-daemon-kckvj\" (UID: \"e437b5da-7e75-4ed5-8d79-e418168b80fe\") " pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:03:54.320719 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.320611 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6bbb2a8-d1d8-46e7-a943-36618a64adb4-host-slash\") pod \"iptables-alerter-gwnnd\" (UID: \"c6bbb2a8-d1d8-46e7-a943-36618a64adb4\") " pod="openshift-network-operator/iptables-alerter-gwnnd" Apr 21 10:03:54.320719 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.320639 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxdxm\" (UniqueName: \"kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm\") pod 
\"network-check-target-qxvf7\" (UID: \"8723d4f5-441e-4586-b642-f008d599b082\") " pod="openshift-network-diagnostics/network-check-target-qxvf7" Apr 21 10:03:54.320719 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.320663 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b7b9a85e-3e84-4e88-b193-a2eff9d45b6a-konnectivity-ca\") pod \"konnectivity-agent-8c424\" (UID: \"b7b9a85e-3e84-4e88-b193-a2eff9d45b6a\") " pod="kube-system/konnectivity-agent-8c424" Apr 21 10:03:54.320719 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.320692 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8v5fc\" (UniqueName: \"kubernetes.io/projected/c6bbb2a8-d1d8-46e7-a943-36618a64adb4-kube-api-access-8v5fc\") pod \"iptables-alerter-gwnnd\" (UID: \"c6bbb2a8-d1d8-46e7-a943-36618a64adb4\") " pod="openshift-network-operator/iptables-alerter-gwnnd" Apr 21 10:03:54.320719 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.320715 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qhnzq\" (UniqueName: \"kubernetes.io/projected/e437b5da-7e75-4ed5-8d79-e418168b80fe-kube-api-access-qhnzq\") pod \"network-metrics-daemon-kckvj\" (UID: \"e437b5da-7e75-4ed5-8d79-e418168b80fe\") " pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:03:54.321051 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.320714 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c6bbb2a8-d1d8-46e7-a943-36618a64adb4-host-slash\") pod \"iptables-alerter-gwnnd\" (UID: \"c6bbb2a8-d1d8-46e7-a943-36618a64adb4\") " pod="openshift-network-operator/iptables-alerter-gwnnd" Apr 21 10:03:54.321051 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:54.320768 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:54.321051 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:54.320971 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs podName:e437b5da-7e75-4ed5-8d79-e418168b80fe nodeName:}" failed. No retries permitted until 2026-04-21 10:03:54.820934388 +0000 UTC m=+3.159468401 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs") pod "network-metrics-daemon-kckvj" (UID: "e437b5da-7e75-4ed5-8d79-e418168b80fe") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 21 10:03:54.321282 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.321253 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c6bbb2a8-d1d8-46e7-a943-36618a64adb4-iptables-alerter-script\") pod \"iptables-alerter-gwnnd\" (UID: \"c6bbb2a8-d1d8-46e7-a943-36618a64adb4\") " pod="openshift-network-operator/iptables-alerter-gwnnd" Apr 21 10:03:54.321477 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.321457 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/b7b9a85e-3e84-4e88-b193-a2eff9d45b6a-konnectivity-ca\") pod \"konnectivity-agent-8c424\" (UID: \"b7b9a85e-3e84-4e88-b193-a2eff9d45b6a\") " pod="kube-system/konnectivity-agent-8c424" Apr 21 10:03:54.323218 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.323195 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/b7b9a85e-3e84-4e88-b193-a2eff9d45b6a-agent-certs\") pod \"konnectivity-agent-8c424\" (UID: \"b7b9a85e-3e84-4e88-b193-a2eff9d45b6a\") " pod="kube-system/konnectivity-agent-8c424" Apr 21 10:03:54.329377 ip-10-0-133-157 
kubenswrapper[2577]: E0421 10:03:54.329354 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 21 10:03:54.329377 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:54.329380 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 21 10:03:54.329524 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:54.329395 2577 projected.go:194] Error preparing data for projected volume kube-api-access-pxdxm for pod openshift-network-diagnostics/network-check-target-qxvf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:54.329524 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:54.329480 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm podName:8723d4f5-441e-4586-b642-f008d599b082 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:54.829460498 +0000 UTC m=+3.167994493 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-pxdxm" (UniqueName: "kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm") pod "network-check-target-qxvf7" (UID: "8723d4f5-441e-4586-b642-f008d599b082") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 21 10:03:54.332015 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.331948 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v5fc\" (UniqueName: \"kubernetes.io/projected/c6bbb2a8-d1d8-46e7-a943-36618a64adb4-kube-api-access-8v5fc\") pod \"iptables-alerter-gwnnd\" (UID: \"c6bbb2a8-d1d8-46e7-a943-36618a64adb4\") " pod="openshift-network-operator/iptables-alerter-gwnnd" Apr 21 10:03:54.332015 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.332004 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhnzq\" (UniqueName: \"kubernetes.io/projected/e437b5da-7e75-4ed5-8d79-e418168b80fe-kube-api-access-qhnzq\") pod \"network-metrics-daemon-kckvj\" (UID: \"e437b5da-7e75-4ed5-8d79-e418168b80fe\") " pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:03:54.377823 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.377790 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 21 10:03:54.406975 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.406937 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" Apr 21 10:03:54.413737 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.413716 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g77pc" Apr 21 10:03:54.422589 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.422565 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xtxxg" Apr 21 10:03:54.427226 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.427202 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:03:54.433889 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.433853 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" Apr 21 10:03:54.441475 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.441453 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-c588q" Apr 21 10:03:54.449094 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.449071 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4ggkj" Apr 21 10:03:54.454270 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.454252 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-gwnnd" Apr 21 10:03:54.460876 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.460853 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-8c424" Apr 21 10:03:54.713030 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:54.712982 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod186c1594_0ba9_495b_8213_27692e681b57.slice/crio-955391d85540236cb055b1453d50bc156ec37d26f6d07172c352df6db1192a93 WatchSource:0}: Error finding container 955391d85540236cb055b1453d50bc156ec37d26f6d07172c352df6db1192a93: Status 404 returned error can't find the container with id 955391d85540236cb055b1453d50bc156ec37d26f6d07172c352df6db1192a93 Apr 21 10:03:54.715473 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:54.715446 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e8275fc_da1c_442f_8dd4_cc1ad0f529fe.slice/crio-84f7ea2b5756950cd988e0d505f973ab82d3f7973c2e7d4935d99f23e9d882fe WatchSource:0}: Error finding container 84f7ea2b5756950cd988e0d505f973ab82d3f7973c2e7d4935d99f23e9d882fe: Status 404 returned error can't find the container with id 84f7ea2b5756950cd988e0d505f973ab82d3f7973c2e7d4935d99f23e9d882fe Apr 21 10:03:54.717520 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:54.717405 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod451f825c_7185_464f_967b_97007b1437b8.slice/crio-810a125046d638e5b4ac25f973ff2f47f12430888f6c8bd63d9d0b6de3803007 WatchSource:0}: Error finding container 810a125046d638e5b4ac25f973ff2f47f12430888f6c8bd63d9d0b6de3803007: Status 404 returned error can't find the container with id 810a125046d638e5b4ac25f973ff2f47f12430888f6c8bd63d9d0b6de3803007 Apr 21 10:03:54.719906 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:54.719882 2577 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda327162a_7ff0_4ea9_9be0_15ce746f80a2.slice/crio-411efc82a1c9523b09f7407b0a2b6749dc106816777a036603fadb564d4d8701 WatchSource:0}: Error finding container 411efc82a1c9523b09f7407b0a2b6749dc106816777a036603fadb564d4d8701: Status 404 returned error can't find the container with id 411efc82a1c9523b09f7407b0a2b6749dc106816777a036603fadb564d4d8701 Apr 21 10:03:54.720919 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:54.720891 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4e66db4_f8f5_415c_aac2_60c02dfc43ff.slice/crio-a2fba6916fef0d80c0c30629a8c4e1e1a2fdce5cedbda2732f6ffb6c55a7a6c7 WatchSource:0}: Error finding container a2fba6916fef0d80c0c30629a8c4e1e1a2fdce5cedbda2732f6ffb6c55a7a6c7: Status 404 returned error can't find the container with id a2fba6916fef0d80c0c30629a8c4e1e1a2fdce5cedbda2732f6ffb6c55a7a6c7 Apr 21 10:03:54.721644 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:54.721621 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e21d9fd_fcef_42e2_8f1f_ef4aa8d9171f.slice/crio-7bb21ce3d2f37c0d5496a6a90a44bfe9e4ddafb580274b7cc4cfbed08d035f92 WatchSource:0}: Error finding container 7bb21ce3d2f37c0d5496a6a90a44bfe9e4ddafb580274b7cc4cfbed08d035f92: Status 404 returned error can't find the container with id 7bb21ce3d2f37c0d5496a6a90a44bfe9e4ddafb580274b7cc4cfbed08d035f92 Apr 21 10:03:54.722939 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:54.722914 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7090d2d9_5242_41cd_9157_a84d2b1a535d.slice/crio-414bdcf7eabc8e575b0542777d0e1f8a247295ecb4f248a38c20293f22b180e2 WatchSource:0}: Error finding container 414bdcf7eabc8e575b0542777d0e1f8a247295ecb4f248a38c20293f22b180e2: Status 404 returned error can't find 
the container with id 414bdcf7eabc8e575b0542777d0e1f8a247295ecb4f248a38c20293f22b180e2
Apr 21 10:03:54.724265 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:54.723584 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6bbb2a8_d1d8_46e7_a943_36618a64adb4.slice/crio-2f7593abf005539cff20b615d12c7f7a3e78682a0be981ea2a32501906ca267f WatchSource:0}: Error finding container 2f7593abf005539cff20b615d12c7f7a3e78682a0be981ea2a32501906ca267f: Status 404 returned error can't find the container with id 2f7593abf005539cff20b615d12c7f7a3e78682a0be981ea2a32501906ca267f
Apr 21 10:03:54.724810 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:03:54.724545 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7b9a85e_3e84_4e88_b193_a2eff9d45b6a.slice/crio-d20be9499cf46d4c4d71ed4506e1a83f353083771bd79816062aa2c117cbc78f WatchSource:0}: Error finding container d20be9499cf46d4c4d71ed4506e1a83f353083771bd79816062aa2c117cbc78f: Status 404 returned error can't find the container with id d20be9499cf46d4c4d71ed4506e1a83f353083771bd79816062aa2c117cbc78f
Apr 21 10:03:54.823764 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.823714 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs\") pod \"network-metrics-daemon-kckvj\" (UID: \"e437b5da-7e75-4ed5-8d79-e418168b80fe\") " pod="openshift-multus/network-metrics-daemon-kckvj"
Apr 21 10:03:54.823934 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:54.823914 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:03:54.823997 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:54.823982 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs podName:e437b5da-7e75-4ed5-8d79-e418168b80fe nodeName:}" failed. No retries permitted until 2026-04-21 10:03:55.823967448 +0000 UTC m=+4.162501447 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs") pod "network-metrics-daemon-kckvj" (UID: "e437b5da-7e75-4ed5-8d79-e418168b80fe") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:03:54.924824 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:54.924793 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxdxm\" (UniqueName: \"kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm\") pod \"network-check-target-qxvf7\" (UID: \"8723d4f5-441e-4586-b642-f008d599b082\") " pod="openshift-network-diagnostics/network-check-target-qxvf7"
Apr 21 10:03:54.924974 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:54.924922 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 10:03:54.924974 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:54.924939 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 10:03:54.924974 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:54.924949 2577 projected.go:194] Error preparing data for projected volume kube-api-access-pxdxm for pod openshift-network-diagnostics/network-check-target-qxvf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:03:54.925081 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:54.924999 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm podName:8723d4f5-441e-4586-b642-f008d599b082 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:55.924985325 +0000 UTC m=+4.263519325 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxdxm" (UniqueName: "kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm") pod "network-check-target-qxvf7" (UID: "8723d4f5-441e-4586-b642-f008d599b082") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:03:55.144411 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:55.144317 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-20 09:58:53 +0000 UTC" deadline="2028-01-09 15:46:06.851505776 +0000 UTC"
Apr 21 10:03:55.144411 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:55.144358 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15077h42m11.707150876s"
Apr 21 10:03:55.240295 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:55.240254 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" event={"ID":"7090d2d9-5242-41cd-9157-a84d2b1a535d","Type":"ContainerStarted","Data":"414bdcf7eabc8e575b0542777d0e1f8a247295ecb4f248a38c20293f22b180e2"}
Apr 21 10:03:55.246625 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:55.245980 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4ggkj" event={"ID":"b4e66db4-f8f5-415c-aac2-60c02dfc43ff","Type":"ContainerStarted","Data":"a2fba6916fef0d80c0c30629a8c4e1e1a2fdce5cedbda2732f6ffb6c55a7a6c7"}
Apr 21 10:03:55.250420 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:55.250385 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xtxxg" event={"ID":"a327162a-7ff0-4ea9-9be0-15ce746f80a2","Type":"ContainerStarted","Data":"411efc82a1c9523b09f7407b0a2b6749dc106816777a036603fadb564d4d8701"}
Apr 21 10:03:55.259869 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:55.259783 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" event={"ID":"451f825c-7185-464f-967b-97007b1437b8","Type":"ContainerStarted","Data":"810a125046d638e5b4ac25f973ff2f47f12430888f6c8bd63d9d0b6de3803007"}
Apr 21 10:03:55.264241 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:55.264212 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g77pc" event={"ID":"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe","Type":"ContainerStarted","Data":"84f7ea2b5756950cd988e0d505f973ab82d3f7973c2e7d4935d99f23e9d882fe"}
Apr 21 10:03:55.274573 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:55.274535 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c588q" event={"ID":"186c1594-0ba9-495b-8213-27692e681b57","Type":"ContainerStarted","Data":"955391d85540236cb055b1453d50bc156ec37d26f6d07172c352df6db1192a93"}
Apr 21 10:03:55.287807 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:55.287004 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-157.ec2.internal" event={"ID":"6c2c9c59cdaa3a4ce6126af55beb4c88","Type":"ContainerStarted","Data":"f0a55d1ecb0fa60c238f57eff65ea48813dcd8a83fb185944c1a5b84a227b5e1"}
Apr 21 10:03:55.292658 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:55.292623 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8c424" event={"ID":"b7b9a85e-3e84-4e88-b193-a2eff9d45b6a","Type":"ContainerStarted","Data":"d20be9499cf46d4c4d71ed4506e1a83f353083771bd79816062aa2c117cbc78f"}
Apr 21 10:03:55.299013 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:55.298979 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gwnnd" event={"ID":"c6bbb2a8-d1d8-46e7-a943-36618a64adb4","Type":"ContainerStarted","Data":"2f7593abf005539cff20b615d12c7f7a3e78682a0be981ea2a32501906ca267f"}
Apr 21 10:03:55.299647 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:55.299598 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-157.ec2.internal" podStartSLOduration=2.299582373 podStartE2EDuration="2.299582373s" podCreationTimestamp="2026-04-21 10:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:03:55.29919525 +0000 UTC m=+3.637729268" watchObservedRunningTime="2026-04-21 10:03:55.299582373 +0000 UTC m=+3.638116391"
Apr 21 10:03:55.305840 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:55.305805 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" event={"ID":"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f","Type":"ContainerStarted","Data":"7bb21ce3d2f37c0d5496a6a90a44bfe9e4ddafb580274b7cc4cfbed08d035f92"}
Apr 21 10:03:55.831702 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:55.831655 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs\") pod \"network-metrics-daemon-kckvj\" (UID: \"e437b5da-7e75-4ed5-8d79-e418168b80fe\") " pod="openshift-multus/network-metrics-daemon-kckvj"
Apr 21 10:03:55.831873 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:55.831850 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:03:55.831945 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:55.831916 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs podName:e437b5da-7e75-4ed5-8d79-e418168b80fe nodeName:}" failed. No retries permitted until 2026-04-21 10:03:57.8318983 +0000 UTC m=+6.170432301 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs") pod "network-metrics-daemon-kckvj" (UID: "e437b5da-7e75-4ed5-8d79-e418168b80fe") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:03:55.933320 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:55.932622 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxdxm\" (UniqueName: \"kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm\") pod \"network-check-target-qxvf7\" (UID: \"8723d4f5-441e-4586-b642-f008d599b082\") " pod="openshift-network-diagnostics/network-check-target-qxvf7"
Apr 21 10:03:55.933320 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:55.932830 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 10:03:55.933320 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:55.932855 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 10:03:55.933320 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:55.932869 2577 projected.go:194] Error preparing data for projected volume kube-api-access-pxdxm for pod openshift-network-diagnostics/network-check-target-qxvf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:03:55.933320 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:55.932930 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm podName:8723d4f5-441e-4586-b642-f008d599b082 nodeName:}" failed. No retries permitted until 2026-04-21 10:03:57.932910796 +0000 UTC m=+6.271444798 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxdxm" (UniqueName: "kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm") pod "network-check-target-qxvf7" (UID: "8723d4f5-441e-4586-b642-f008d599b082") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:03:56.229670 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:56.228917 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj"
Apr 21 10:03:56.229670 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:56.229055 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe"
Apr 21 10:03:56.229670 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:56.229477 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7"
Apr 21 10:03:56.229670 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:56.229565 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082"
Apr 21 10:03:56.335739 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:56.335697 2577 generic.go:358] "Generic (PLEG): container finished" podID="0dca983b0ab93ffb2f6b3f066e9c69dd" containerID="7095e3dbe760c69a84d278659ab5eb83457c193286f73dc071590c90e13f3a4b" exitCode=0
Apr 21 10:03:56.336580 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:56.336252 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal" event={"ID":"0dca983b0ab93ffb2f6b3f066e9c69dd","Type":"ContainerDied","Data":"7095e3dbe760c69a84d278659ab5eb83457c193286f73dc071590c90e13f3a4b"}
Apr 21 10:03:57.342492 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:57.342451 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal" event={"ID":"0dca983b0ab93ffb2f6b3f066e9c69dd","Type":"ContainerStarted","Data":"a76bbdb777c83fa766092343f773cdfa411b9e2d105910c040c5b1cf583dea8b"}
Apr 21 10:03:57.851419 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:57.851382 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs\") pod \"network-metrics-daemon-kckvj\" (UID: \"e437b5da-7e75-4ed5-8d79-e418168b80fe\") " pod="openshift-multus/network-metrics-daemon-kckvj"
Apr 21 10:03:57.851617 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:57.851575 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:03:57.851706 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:57.851639 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs podName:e437b5da-7e75-4ed5-8d79-e418168b80fe nodeName:}" failed. No retries permitted until 2026-04-21 10:04:01.851620477 +0000 UTC m=+10.190154472 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs") pod "network-metrics-daemon-kckvj" (UID: "e437b5da-7e75-4ed5-8d79-e418168b80fe") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:03:57.952996 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:57.952698 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxdxm\" (UniqueName: \"kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm\") pod \"network-check-target-qxvf7\" (UID: \"8723d4f5-441e-4586-b642-f008d599b082\") " pod="openshift-network-diagnostics/network-check-target-qxvf7"
Apr 21 10:03:57.952996 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:57.952926 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 10:03:57.952996 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:57.952950 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 10:03:57.952996 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:57.952963 2577 projected.go:194] Error preparing data for projected volume kube-api-access-pxdxm for pod openshift-network-diagnostics/network-check-target-qxvf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:03:57.953348 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:57.953023 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm podName:8723d4f5-441e-4586-b642-f008d599b082 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:01.953008801 +0000 UTC m=+10.291542796 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxdxm" (UniqueName: "kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm") pod "network-check-target-qxvf7" (UID: "8723d4f5-441e-4586-b642-f008d599b082") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:03:58.228126 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:58.228044 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj"
Apr 21 10:03:58.228279 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:03:58.228044 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7"
Apr 21 10:03:58.228628 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:58.228601 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe"
Apr 21 10:03:58.228786 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:03:58.228719 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082"
Apr 21 10:04:00.227985 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:00.227947 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7"
Apr 21 10:04:00.228427 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:00.227995 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj"
Apr 21 10:04:00.228427 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:00.228093 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082"
Apr 21 10:04:00.228427 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:00.228265 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe"
Apr 21 10:04:01.890163 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:01.890030 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs\") pod \"network-metrics-daemon-kckvj\" (UID: \"e437b5da-7e75-4ed5-8d79-e418168b80fe\") " pod="openshift-multus/network-metrics-daemon-kckvj"
Apr 21 10:04:01.890641 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:01.890219 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:04:01.890641 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:01.890288 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs podName:e437b5da-7e75-4ed5-8d79-e418168b80fe nodeName:}" failed. No retries permitted until 2026-04-21 10:04:09.890269634 +0000 UTC m=+18.228803651 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs") pod "network-metrics-daemon-kckvj" (UID: "e437b5da-7e75-4ed5-8d79-e418168b80fe") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:04:01.990710 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:01.990601 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxdxm\" (UniqueName: \"kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm\") pod \"network-check-target-qxvf7\" (UID: \"8723d4f5-441e-4586-b642-f008d599b082\") " pod="openshift-network-diagnostics/network-check-target-qxvf7"
Apr 21 10:04:01.990885 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:01.990774 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 10:04:01.990885 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:01.990802 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 10:04:01.990885 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:01.990816 2577 projected.go:194] Error preparing data for projected volume kube-api-access-pxdxm for pod openshift-network-diagnostics/network-check-target-qxvf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:04:01.990885 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:01.990884 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm podName:8723d4f5-441e-4586-b642-f008d599b082 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:09.990863395 +0000 UTC m=+18.329397406 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxdxm" (UniqueName: "kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm") pod "network-check-target-qxvf7" (UID: "8723d4f5-441e-4586-b642-f008d599b082") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:04:02.229259 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:02.229178 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj"
Apr 21 10:04:02.229427 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:02.229322 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe"
Apr 21 10:04:02.229427 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:02.229353 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7"
Apr 21 10:04:02.229528 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:02.229481 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082"
Apr 21 10:04:04.227640 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:04.227579 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7"
Apr 21 10:04:04.228064 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:04.227612 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj"
Apr 21 10:04:04.228064 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:04.227769 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082"
Apr 21 10:04:04.228064 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:04.227827 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe"
Apr 21 10:04:06.230826 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:06.230791 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7"
Apr 21 10:04:06.230826 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:06.230809 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj"
Apr 21 10:04:06.231314 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:06.230907 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082"
Apr 21 10:04:06.231314 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:06.231018 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe"
Apr 21 10:04:08.227885 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:08.227846 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7"
Apr 21 10:04:08.228397 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:08.227974 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082"
Apr 21 10:04:08.228397 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:08.228034 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj"
Apr 21 10:04:08.228397 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:08.228157 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe"
Apr 21 10:04:09.948332 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:09.948284 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs\") pod \"network-metrics-daemon-kckvj\" (UID: \"e437b5da-7e75-4ed5-8d79-e418168b80fe\") " pod="openshift-multus/network-metrics-daemon-kckvj"
Apr 21 10:04:09.948849 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:09.948492 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:04:09.948849 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:09.948580 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs podName:e437b5da-7e75-4ed5-8d79-e418168b80fe nodeName:}" failed. No retries permitted until 2026-04-21 10:04:25.948558276 +0000 UTC m=+34.287092289 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs") pod "network-metrics-daemon-kckvj" (UID: "e437b5da-7e75-4ed5-8d79-e418168b80fe") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 21 10:04:10.048912 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:10.048872 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxdxm\" (UniqueName: \"kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm\") pod \"network-check-target-qxvf7\" (UID: \"8723d4f5-441e-4586-b642-f008d599b082\") " pod="openshift-network-diagnostics/network-check-target-qxvf7"
Apr 21 10:04:10.049081 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:10.049021 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 21 10:04:10.049081 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:10.049054 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 21 10:04:10.049081 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:10.049067 2577 projected.go:194] Error preparing data for projected volume kube-api-access-pxdxm for pod openshift-network-diagnostics/network-check-target-qxvf7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:04:10.049195 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:10.049119 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm podName:8723d4f5-441e-4586-b642-f008d599b082 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:26.049106173 +0000 UTC m=+34.387640173 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-pxdxm" (UniqueName: "kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm") pod "network-check-target-qxvf7" (UID: "8723d4f5-441e-4586-b642-f008d599b082") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 21 10:04:10.230380 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:10.230298 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7"
Apr 21 10:04:10.230537 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:10.230312 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj"
Apr 21 10:04:10.230537 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:10.230468 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082"
Apr 21 10:04:10.230655 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:10.230618 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe"
Apr 21 10:04:12.230200 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.230030 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7"
Apr 21 10:04:12.230768 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.230032 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj"
Apr 21 10:04:12.230768 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:12.230275 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082"
Apr 21 10:04:12.230768 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:12.230345 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe"
Apr 21 10:04:12.369994 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.369956 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" event={"ID":"451f825c-7185-464f-967b-97007b1437b8","Type":"ContainerStarted","Data":"4262f072f3d0b5bb84b6c772dea0f01eed7f36c014a4d42545ae55e89b2fe762"}
Apr 21 10:04:12.371834 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.371795 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g77pc" event={"ID":"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe","Type":"ContainerStarted","Data":"de571c9cce627436dfb0e408f531e333944f9175d514c41e9be6da88b99e4ece"}
Apr 21 10:04:12.373226 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.373188 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-c588q" event={"ID":"186c1594-0ba9-495b-8213-27692e681b57","Type":"ContainerStarted","Data":"0b59040a14a616496bffd1b6bc3b0f8919407c5209f38303b21894640bf36e70"}
Apr 21 10:04:12.375432 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.375404 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8c424" event={"ID":"b7b9a85e-3e84-4e88-b193-a2eff9d45b6a","Type":"ContainerStarted","Data":"b27872c42ba2e82965a78f9e2c4790e7d858d44b93ae48a5ab35658acdc601c2"}
Apr 21 10:04:12.376939 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.376919 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" event={"ID":"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f","Type":"ContainerStarted","Data":"27cc2c478db03334036088df1a4c1c90451a5cca51296a693026f41ffe795d53"}
Apr 21 10:04:12.377065 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.376946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" event={"ID":"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f","Type":"ContainerStarted","Data":"00a11680849fff9a36f6017a7e14cd7475b4fc0197efc7dfa862f9b5d214befa"}
Apr 21 10:04:12.378162 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.378136 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" event={"ID":"7090d2d9-5242-41cd-9157-a84d2b1a535d","Type":"ContainerStarted","Data":"65d75e89c6f21cdf7203520f3465a1d06129c54751e6519f08f9967392b7f18f"}
Apr 21 10:04:12.379345 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.379323 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4ggkj" event={"ID":"b4e66db4-f8f5-415c-aac2-60c02dfc43ff","Type":"ContainerStarted","Data":"098ab691951c680954f5544629e16e8b229dbc6b38d759963bbca9e23f4f9eda"}
Apr 21 10:04:12.380555 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.380533 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xtxxg" event={"ID":"a327162a-7ff0-4ea9-9be0-15ce746f80a2","Type":"ContainerStarted","Data":"04b2a71cfed137c1927108b99a9ab24212f650a9c01196f8039549ffc5154c44"}
Apr 21 10:04:12.387794 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.387724 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-157.ec2.internal" podStartSLOduration=19.387709944 podStartE2EDuration="19.387709944s" podCreationTimestamp="2026-04-21 10:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:03:57.358702167 +0000 UTC m=+5.697236185" watchObservedRunningTime="2026-04-21 10:04:12.387709944 +0000 UTC m=+20.726243964"
Apr 21 10:04:12.388292 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.388253 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration"
pod="openshift-cluster-node-tuning-operator/tuned-h7tlg" podStartSLOduration=3.197287597 podStartE2EDuration="20.388242673s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.719449405 +0000 UTC m=+3.057983400" lastFinishedPulling="2026-04-21 10:04:11.910404465 +0000 UTC m=+20.248938476" observedRunningTime="2026-04-21 10:04:12.387309189 +0000 UTC m=+20.725843205" watchObservedRunningTime="2026-04-21 10:04:12.388242673 +0000 UTC m=+20.726776689" Apr 21 10:04:12.432038 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.431989 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-c588q" podStartSLOduration=3.2639406060000002 podStartE2EDuration="20.431974035s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.716255563 +0000 UTC m=+3.054789558" lastFinishedPulling="2026-04-21 10:04:11.884288981 +0000 UTC m=+20.222822987" observedRunningTime="2026-04-21 10:04:12.431687794 +0000 UTC m=+20.770221812" watchObservedRunningTime="2026-04-21 10:04:12.431974035 +0000 UTC m=+20.770508051" Apr 21 10:04:12.446912 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.446867 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4ggkj" podStartSLOduration=3.2869639680000002 podStartE2EDuration="20.446855642s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.724143286 +0000 UTC m=+3.062677284" lastFinishedPulling="2026-04-21 10:04:11.884034953 +0000 UTC m=+20.222568958" observedRunningTime="2026-04-21 10:04:12.446661973 +0000 UTC m=+20.785195991" watchObservedRunningTime="2026-04-21 10:04:12.446855642 +0000 UTC m=+20.785389658" Apr 21 10:04:12.466772 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.466712 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8c424" 
podStartSLOduration=11.548456416 podStartE2EDuration="20.46669792s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.727052829 +0000 UTC m=+3.065586827" lastFinishedPulling="2026-04-21 10:04:03.645294333 +0000 UTC m=+11.983828331" observedRunningTime="2026-04-21 10:04:12.466217214 +0000 UTC m=+20.804751231" watchObservedRunningTime="2026-04-21 10:04:12.46669792 +0000 UTC m=+20.805231936" Apr 21 10:04:12.488482 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.488304 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xtxxg" podStartSLOduration=3.294281337 podStartE2EDuration="20.488290429s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.72176654 +0000 UTC m=+3.060300547" lastFinishedPulling="2026-04-21 10:04:11.915775644 +0000 UTC m=+20.254309639" observedRunningTime="2026-04-21 10:04:12.487814163 +0000 UTC m=+20.826348180" watchObservedRunningTime="2026-04-21 10:04:12.488290429 +0000 UTC m=+20.826824445" Apr 21 10:04:12.890872 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.890829 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8c424" Apr 21 10:04:12.918021 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.917987 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-8c424" Apr 21 10:04:12.918886 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:12.918862 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-8c424" Apr 21 10:04:13.191257 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:13.191228 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 21 10:04:13.383873 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:13.383781 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-gwnnd" event={"ID":"c6bbb2a8-d1d8-46e7-a943-36618a64adb4","Type":"ContainerStarted","Data":"f1a27e84b4a3a8fdf4926b5f002eda425e29735837e2eb9fcf81ba0f6842d891"} Apr 21 10:04:13.386257 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:13.386233 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" event={"ID":"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f","Type":"ContainerStarted","Data":"46f72e52ed2b9fd23696264b1a103fcc8f6a4b60b2f4af8a0a6bac7ef9b40fc5"} Apr 21 10:04:13.386363 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:13.386261 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" event={"ID":"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f","Type":"ContainerStarted","Data":"ea52dc412a7df81f7c983bc71fa5377ed6ce2103464db1726e1804524dccde6b"} Apr 21 10:04:13.386363 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:13.386271 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" event={"ID":"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f","Type":"ContainerStarted","Data":"46c9588b8a4b9be5b9d6b4bb327d3221e5bac2e7e54695055242b9b298bd089b"} Apr 21 10:04:13.386363 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:13.386279 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" event={"ID":"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f","Type":"ContainerStarted","Data":"dceaf7d8dcf15c13204903fd1e2d4e5045cf3b82675695376c52a36b5cf6e3fa"} Apr 21 10:04:13.387825 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:13.387799 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" event={"ID":"7090d2d9-5242-41cd-9157-a84d2b1a535d","Type":"ContainerStarted","Data":"11b6173f1a1daaf66f6060e7a2207b5b5e4984c4e3f6219b24edd1d92f15243c"} Apr 21 10:04:13.388947 
ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:13.388927 2577 generic.go:358] "Generic (PLEG): container finished" podID="1e8275fc-da1c-442f-8dd4-cc1ad0f529fe" containerID="de571c9cce627436dfb0e408f531e333944f9175d514c41e9be6da88b99e4ece" exitCode=0 Apr 21 10:04:13.389062 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:13.389042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g77pc" event={"ID":"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe","Type":"ContainerDied","Data":"de571c9cce627436dfb0e408f531e333944f9175d514c41e9be6da88b99e4ece"} Apr 21 10:04:13.390037 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:13.389851 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8c424" Apr 21 10:04:13.401725 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:13.401678 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-gwnnd" podStartSLOduration=4.242915522 podStartE2EDuration="21.401667524s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.725518946 +0000 UTC m=+3.064052946" lastFinishedPulling="2026-04-21 10:04:11.884270953 +0000 UTC m=+20.222804948" observedRunningTime="2026-04-21 10:04:13.401412367 +0000 UTC m=+21.739946381" watchObservedRunningTime="2026-04-21 10:04:13.401667524 +0000 UTC m=+21.740201540" Apr 21 10:04:14.180886 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:14.180788 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-21T10:04:13.191253204Z","UUID":"122471a0-c133-4619-b7f9-1878f6ad8e39","Handler":null,"Name":"","Endpoint":""} Apr 21 10:04:14.183433 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:14.183403 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com 
endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 21 10:04:14.183433 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:14.183441 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 21 10:04:14.227741 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:14.227667 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:04:14.227741 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:14.227722 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7" Apr 21 10:04:14.227961 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:14.227846 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe" Apr 21 10:04:14.228025 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:14.227964 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082" Apr 21 10:04:14.393460 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:14.393309 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" event={"ID":"7090d2d9-5242-41cd-9157-a84d2b1a535d","Type":"ContainerStarted","Data":"49de634d0f1d425997f811c4a1ea8fd8d2b11bf7c91bfc277ebc88609e5b666e"} Apr 21 10:04:14.409961 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:14.409909 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tz8td" podStartSLOduration=3.041541551 podStartE2EDuration="22.409893479s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.726548411 +0000 UTC m=+3.065082406" lastFinishedPulling="2026-04-21 10:04:14.094900333 +0000 UTC m=+22.433434334" observedRunningTime="2026-04-21 10:04:14.409591346 +0000 UTC m=+22.748125363" watchObservedRunningTime="2026-04-21 10:04:14.409893479 +0000 UTC m=+22.748427493" Apr 21 10:04:15.398816 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:15.398775 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" event={"ID":"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f","Type":"ContainerStarted","Data":"326dbb947d31403a4071193b6db099bda3ce9fa87e9d63bc76c14f6bfc1ebc73"} Apr 21 10:04:16.227987 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:16.227951 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:04:16.228166 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:16.227951 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7" Apr 21 10:04:16.228166 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:16.228102 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe" Apr 21 10:04:16.228296 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:16.228182 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082" Apr 21 10:04:18.227858 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:18.227827 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7" Apr 21 10:04:18.228396 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:18.227834 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:04:18.228396 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:18.227923 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082" Apr 21 10:04:18.228396 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:18.228043 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe" Apr 21 10:04:18.405397 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:18.405362 2577 generic.go:358] "Generic (PLEG): container finished" podID="1e8275fc-da1c-442f-8dd4-cc1ad0f529fe" containerID="775e930d4a677ba2d13e92fdf131ffbfb4af27840729c2601e95051789db77e0" exitCode=0 Apr 21 10:04:18.405549 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:18.405448 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g77pc" event={"ID":"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe","Type":"ContainerDied","Data":"775e930d4a677ba2d13e92fdf131ffbfb4af27840729c2601e95051789db77e0"} Apr 21 10:04:18.408686 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:18.408664 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" event={"ID":"7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f","Type":"ContainerStarted","Data":"1ca8725b92916b649ed774447253dddefcf0a75cdf7f23dad2af096ad1540535"} Apr 21 10:04:18.409051 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:18.409035 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:04:18.423803 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:18.423773 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:04:19.373035 
ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:19.372842 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" podStartSLOduration=9.893903045 podStartE2EDuration="27.372825126s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.723897869 +0000 UTC m=+3.062431864" lastFinishedPulling="2026-04-21 10:04:12.202819932 +0000 UTC m=+20.541353945" observedRunningTime="2026-04-21 10:04:18.471991407 +0000 UTC m=+26.810525424" watchObservedRunningTime="2026-04-21 10:04:19.372825126 +0000 UTC m=+27.711359143" Apr 21 10:04:19.373599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:19.373309 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qxvf7"] Apr 21 10:04:19.373599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:19.373444 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7" Apr 21 10:04:19.373599 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:19.373560 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082" Apr 21 10:04:19.375708 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:19.375688 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kckvj"] Apr 21 10:04:19.375852 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:19.375840 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:04:19.375959 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:19.375943 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe" Apr 21 10:04:19.413399 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:19.413323 2577 generic.go:358] "Generic (PLEG): container finished" podID="1e8275fc-da1c-442f-8dd4-cc1ad0f529fe" containerID="1d8b27dd50841e83f52292fb7d04897b20f06cb9f98dd19e1d58a5bc75e57f34" exitCode=0 Apr 21 10:04:19.413547 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:19.413397 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g77pc" event={"ID":"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe","Type":"ContainerDied","Data":"1d8b27dd50841e83f52292fb7d04897b20f06cb9f98dd19e1d58a5bc75e57f34"} Apr 21 10:04:19.414500 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:19.414047 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:04:19.414500 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:19.414077 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:04:19.429378 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:19.429354 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:04:20.417301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:20.417207 2577 generic.go:358] "Generic (PLEG): container finished" podID="1e8275fc-da1c-442f-8dd4-cc1ad0f529fe" 
containerID="e50f499b7c9dbce6fed1eefcf8b1e44474978517d494ec0c280ad90105f81294" exitCode=0 Apr 21 10:04:20.417301 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:20.417288 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g77pc" event={"ID":"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe","Type":"ContainerDied","Data":"e50f499b7c9dbce6fed1eefcf8b1e44474978517d494ec0c280ad90105f81294"} Apr 21 10:04:21.227877 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:21.227791 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7" Apr 21 10:04:21.228024 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:21.227791 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:04:21.228024 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:21.227914 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082" Apr 21 10:04:21.228127 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:21.228020 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe" Apr 21 10:04:23.227378 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:23.227339 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:04:23.227378 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:23.227372 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7" Apr 21 10:04:23.228055 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:23.227466 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe" Apr 21 10:04:23.228055 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:23.227603 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-qxvf7" podUID="8723d4f5-441e-4586-b642-f008d599b082" Apr 21 10:04:25.023482 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.023366 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-157.ec2.internal" event="NodeReady" Apr 21 10:04:25.023934 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.023520 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 21 10:04:25.077557 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.077451 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-6lwhm"] Apr 21 10:04:25.103182 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.103132 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-75v2d"] Apr 21 10:04:25.103438 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.103359 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:04:25.106364 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.106314 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2fp54\"" Apr 21 10:04:25.106868 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.106315 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 21 10:04:25.106868 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.106541 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 21 10:04:25.106868 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.106801 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 21 10:04:25.123128 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.123098 
2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6lwhm"] Apr 21 10:04:25.123128 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.123129 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-75v2d"] Apr 21 10:04:25.123343 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.123233 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:25.125658 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.125636 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nqzpr\"" Apr 21 10:04:25.125936 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.125910 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 21 10:04:25.126086 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.126065 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 21 10:04:25.227934 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.227897 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7" Apr 21 10:04:25.228120 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.227902 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:04:25.230974 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.230925 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 21 10:04:25.230974 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.230940 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-w68bg\"" Apr 21 10:04:25.231165 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.230980 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 21 10:04:25.231165 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.230991 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 21 10:04:25.231165 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.230993 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfhd5\"" Apr 21 10:04:25.257369 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.257334 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert\") pod \"ingress-canary-6lwhm\" (UID: \"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0\") " pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:04:25.257508 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.257383 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-config-volume\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 
10:04:25.257508 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.257475 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6922\" (UniqueName: \"kubernetes.io/projected/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-kube-api-access-z6922\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:25.257604 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.257519 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9gqv\" (UniqueName: \"kubernetes.io/projected/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-kube-api-access-h9gqv\") pod \"ingress-canary-6lwhm\" (UID: \"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0\") " pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:04:25.257604 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.257540 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-tmp-dir\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:25.257692 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.257632 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:25.358529 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.358432 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert\") pod \"ingress-canary-6lwhm\" (UID: \"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0\") " 
pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:04:25.358529 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.358492 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-config-volume\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:25.358785 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.358533 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6922\" (UniqueName: \"kubernetes.io/projected/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-kube-api-access-z6922\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:25.358785 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.358582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9gqv\" (UniqueName: \"kubernetes.io/projected/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-kube-api-access-h9gqv\") pod \"ingress-canary-6lwhm\" (UID: \"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0\") " pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:04:25.358785 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.358606 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-tmp-dir\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:25.358785 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:25.358607 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:25.358785 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.358648 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:25.358785 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:25.358682 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert podName:b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:25.858661638 +0000 UTC m=+34.197195646 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert") pod "ingress-canary-6lwhm" (UID: "b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0") : secret "canary-serving-cert" not found Apr 21 10:04:25.358785 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:25.358725 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:25.358785 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:25.358792 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls podName:5a4cb6b4-0870-47bc-b13c-24f96bc4d282 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:25.858773452 +0000 UTC m=+34.197307450 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls") pod "dns-default-75v2d" (UID: "5a4cb6b4-0870-47bc-b13c-24f96bc4d282") : secret "dns-default-metrics-tls" not found Apr 21 10:04:25.359152 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.359097 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-tmp-dir\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:25.359243 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.359178 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-config-volume\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:25.370946 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.370914 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6922\" (UniqueName: \"kubernetes.io/projected/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-kube-api-access-z6922\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:25.371116 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.371065 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9gqv\" (UniqueName: \"kubernetes.io/projected/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-kube-api-access-h9gqv\") pod \"ingress-canary-6lwhm\" (UID: \"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0\") " pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:04:25.862883 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.862849 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:25.863117 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.862907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert\") pod \"ingress-canary-6lwhm\" (UID: \"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0\") " pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:04:25.863117 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:25.863001 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:25.863117 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:25.863010 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:25.863117 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:25.863067 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert podName:b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:26.86304811 +0000 UTC m=+35.201582105 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert") pod "ingress-canary-6lwhm" (UID: "b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0") : secret "canary-serving-cert" not found Apr 21 10:04:25.863117 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:25.863086 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls podName:5a4cb6b4-0870-47bc-b13c-24f96bc4d282 nodeName:}" failed. 
No retries permitted until 2026-04-21 10:04:26.863076983 +0000 UTC m=+35.201610981 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls") pod "dns-default-75v2d" (UID: "5a4cb6b4-0870-47bc-b13c-24f96bc4d282") : secret "dns-default-metrics-tls" not found Apr 21 10:04:25.963826 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:25.963762 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs\") pod \"network-metrics-daemon-kckvj\" (UID: \"e437b5da-7e75-4ed5-8d79-e418168b80fe\") " pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:04:25.964031 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:25.963904 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 10:04:25.964031 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:25.963981 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs podName:e437b5da-7e75-4ed5-8d79-e418168b80fe nodeName:}" failed. No retries permitted until 2026-04-21 10:04:57.963961096 +0000 UTC m=+66.302495107 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs") pod "network-metrics-daemon-kckvj" (UID: "e437b5da-7e75-4ed5-8d79-e418168b80fe") : secret "metrics-daemon-secret" not found Apr 21 10:04:26.064766 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:26.064713 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxdxm\" (UniqueName: \"kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm\") pod \"network-check-target-qxvf7\" (UID: \"8723d4f5-441e-4586-b642-f008d599b082\") " pod="openshift-network-diagnostics/network-check-target-qxvf7" Apr 21 10:04:26.067256 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:26.067233 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxdxm\" (UniqueName: \"kubernetes.io/projected/8723d4f5-441e-4586-b642-f008d599b082-kube-api-access-pxdxm\") pod \"network-check-target-qxvf7\" (UID: \"8723d4f5-441e-4586-b642-f008d599b082\") " pod="openshift-network-diagnostics/network-check-target-qxvf7" Apr 21 10:04:26.140373 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:26.140300 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-qxvf7" Apr 21 10:04:26.303382 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:26.303347 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-qxvf7"] Apr 21 10:04:26.378935 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:04:26.378898 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8723d4f5_441e_4586_b642_f008d599b082.slice/crio-564ff7f48e8f77c6a66a497cc7ecdaea6e153c920b0a1212e74ad98c2666d025 WatchSource:0}: Error finding container 564ff7f48e8f77c6a66a497cc7ecdaea6e153c920b0a1212e74ad98c2666d025: Status 404 returned error can't find the container with id 564ff7f48e8f77c6a66a497cc7ecdaea6e153c920b0a1212e74ad98c2666d025 Apr 21 10:04:26.429333 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:26.429304 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qxvf7" event={"ID":"8723d4f5-441e-4586-b642-f008d599b082","Type":"ContainerStarted","Data":"564ff7f48e8f77c6a66a497cc7ecdaea6e153c920b0a1212e74ad98c2666d025"} Apr 21 10:04:26.871349 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:26.871173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert\") pod \"ingress-canary-6lwhm\" (UID: \"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0\") " pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:04:26.871540 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:26.871388 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:26.871540 
ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:26.871313 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:26.871540 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:26.871466 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert podName:b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:28.871447967 +0000 UTC m=+37.209981964 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert") pod "ingress-canary-6lwhm" (UID: "b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0") : secret "canary-serving-cert" not found Apr 21 10:04:26.871540 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:26.871470 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:26.871540 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:26.871509 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls podName:5a4cb6b4-0870-47bc-b13c-24f96bc4d282 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:28.871497214 +0000 UTC m=+37.210031209 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls") pod "dns-default-75v2d" (UID: "5a4cb6b4-0870-47bc-b13c-24f96bc4d282") : secret "dns-default-metrics-tls" not found Apr 21 10:04:27.434257 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:27.434219 2577 generic.go:358] "Generic (PLEG): container finished" podID="1e8275fc-da1c-442f-8dd4-cc1ad0f529fe" containerID="000e3e9a779a5ea479be2f74cc7f640b41f84d5c3b1505cfb0e5a568b9588045" exitCode=0 Apr 21 10:04:27.434682 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:27.434273 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g77pc" event={"ID":"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe","Type":"ContainerDied","Data":"000e3e9a779a5ea479be2f74cc7f640b41f84d5c3b1505cfb0e5a568b9588045"} Apr 21 10:04:28.439455 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:28.439417 2577 generic.go:358] "Generic (PLEG): container finished" podID="1e8275fc-da1c-442f-8dd4-cc1ad0f529fe" containerID="818595d85cc45a905462893269a4788083c5525322059fc931af81d4ef6fe78f" exitCode=0 Apr 21 10:04:28.439964 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:28.439476 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g77pc" event={"ID":"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe","Type":"ContainerDied","Data":"818595d85cc45a905462893269a4788083c5525322059fc931af81d4ef6fe78f"} Apr 21 10:04:28.886003 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:28.885910 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert\") pod \"ingress-canary-6lwhm\" (UID: \"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0\") " pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:04:28.886179 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:28.886018 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:28.886179 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:28.886068 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:28.886179 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:28.886131 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:28.886179 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:28.886145 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert podName:b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:32.886127195 +0000 UTC m=+41.224661210 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert") pod "ingress-canary-6lwhm" (UID: "b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0") : secret "canary-serving-cert" not found Apr 21 10:04:28.886179 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:28.886182 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls podName:5a4cb6b4-0870-47bc-b13c-24f96bc4d282 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:32.886166105 +0000 UTC m=+41.224700115 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls") pod "dns-default-75v2d" (UID: "5a4cb6b4-0870-47bc-b13c-24f96bc4d282") : secret "dns-default-metrics-tls" not found Apr 21 10:04:30.449828 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:30.449789 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g77pc" event={"ID":"1e8275fc-da1c-442f-8dd4-cc1ad0f529fe","Type":"ContainerStarted","Data":"16fc1abc1b04325f1d9cccb818f55f4d7f9f6ca3f4cca0e103e7cd89d77d8aa3"} Apr 21 10:04:30.450997 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:30.450976 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-qxvf7" event={"ID":"8723d4f5-441e-4586-b642-f008d599b082","Type":"ContainerStarted","Data":"38e11660d8ce24b864754e93413abec49278a57514b58d1a72cc4ddcc3fdd5f5"} Apr 21 10:04:30.451131 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:30.451118 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-qxvf7" Apr 21 10:04:30.474154 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:30.474105 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g77pc" podStartSLOduration=6.777946529 podStartE2EDuration="38.474089837s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:03:54.718261064 +0000 UTC m=+3.056795059" lastFinishedPulling="2026-04-21 10:04:26.414404361 +0000 UTC m=+34.752938367" observedRunningTime="2026-04-21 10:04:30.473732557 +0000 UTC m=+38.812266574" watchObservedRunningTime="2026-04-21 10:04:30.474089837 +0000 UTC m=+38.812623855" Apr 21 10:04:30.497231 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:30.497178 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-target-qxvf7" podStartSLOduration=35.253919562 podStartE2EDuration="38.49716408s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:04:26.390571209 +0000 UTC m=+34.729105209" lastFinishedPulling="2026-04-21 10:04:29.633815729 +0000 UTC m=+37.972349727" observedRunningTime="2026-04-21 10:04:30.496328859 +0000 UTC m=+38.834862876" watchObservedRunningTime="2026-04-21 10:04:30.49716408 +0000 UTC m=+38.835698091" Apr 21 10:04:32.914486 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:32.914447 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:32.914486 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:32.914493 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert\") pod \"ingress-canary-6lwhm\" (UID: \"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0\") " pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:04:32.914938 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:32.914595 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:32.914938 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:32.914598 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:32.914938 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:32.914648 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert podName:b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0 nodeName:}" failed. 
No retries permitted until 2026-04-21 10:04:40.914633492 +0000 UTC m=+49.253167487 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert") pod "ingress-canary-6lwhm" (UID: "b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0") : secret "canary-serving-cert" not found Apr 21 10:04:32.914938 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:32.914664 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls podName:5a4cb6b4-0870-47bc-b13c-24f96bc4d282 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:40.914657543 +0000 UTC m=+49.253191538 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls") pod "dns-default-75v2d" (UID: "5a4cb6b4-0870-47bc-b13c-24f96bc4d282") : secret "dns-default-metrics-tls" not found Apr 21 10:04:40.968393 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:40.968346 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:40.968393 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:40.968401 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert\") pod \"ingress-canary-6lwhm\" (UID: \"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0\") " pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:04:40.968816 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:40.968503 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:40.968816 
ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:40.968571 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls podName:5a4cb6b4-0870-47bc-b13c-24f96bc4d282 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:56.968555593 +0000 UTC m=+65.307089591 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls") pod "dns-default-75v2d" (UID: "5a4cb6b4-0870-47bc-b13c-24f96bc4d282") : secret "dns-default-metrics-tls" not found Apr 21 10:04:40.968816 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:40.968513 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:40.968816 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:40.968647 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert podName:b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0 nodeName:}" failed. No retries permitted until 2026-04-21 10:04:56.968633487 +0000 UTC m=+65.307167486 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert") pod "ingress-canary-6lwhm" (UID: "b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0") : secret "canary-serving-cert" not found Apr 21 10:04:51.429826 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:51.429798 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hc44q" Apr 21 10:04:56.975063 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:56.975019 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert\") pod \"ingress-canary-6lwhm\" (UID: \"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0\") " pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:04:56.975452 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:56.975088 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:04:56.975452 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:56.975198 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:04:56.975452 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:56.975273 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls podName:5a4cb6b4-0870-47bc-b13c-24f96bc4d282 nodeName:}" failed. No retries permitted until 2026-04-21 10:05:28.975255855 +0000 UTC m=+97.313789857 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls") pod "dns-default-75v2d" (UID: "5a4cb6b4-0870-47bc-b13c-24f96bc4d282") : secret "dns-default-metrics-tls" not found Apr 21 10:04:56.975452 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:56.975198 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:04:56.975452 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:56.975314 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert podName:b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0 nodeName:}" failed. No retries permitted until 2026-04-21 10:05:28.975305809 +0000 UTC m=+97.313839819 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert") pod "ingress-canary-6lwhm" (UID: "b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0") : secret "canary-serving-cert" not found Apr 21 10:04:57.982556 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:04:57.982518 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs\") pod \"network-metrics-daemon-kckvj\" (UID: \"e437b5da-7e75-4ed5-8d79-e418168b80fe\") " pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:04:57.982971 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:57.982627 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 21 10:04:57.982971 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:04:57.982681 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs podName:e437b5da-7e75-4ed5-8d79-e418168b80fe nodeName:}" failed. 
No retries permitted until 2026-04-21 10:06:01.982666699 +0000 UTC m=+130.321200699 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs") pod "network-metrics-daemon-kckvj" (UID: "e437b5da-7e75-4ed5-8d79-e418168b80fe") : secret "metrics-daemon-secret" not found Apr 21 10:05:01.455761 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:05:01.455714 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-qxvf7" Apr 21 10:05:28.981657 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:05:28.981608 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:05:28.981657 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:05:28.981677 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert\") pod \"ingress-canary-6lwhm\" (UID: \"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0\") " pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:05:28.982126 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:05:28.981792 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 21 10:05:28.982126 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:05:28.981793 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 21 10:05:28.982126 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:05:28.981860 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert 
podName:b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:32.981843749 +0000 UTC m=+161.320377744 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert") pod "ingress-canary-6lwhm" (UID: "b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0") : secret "canary-serving-cert" not found
Apr 21 10:05:28.982126 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:05:28.981873 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls podName:5a4cb6b4-0870-47bc-b13c-24f96bc4d282 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:32.981867571 +0000 UTC m=+161.320401565 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls") pod "dns-default-75v2d" (UID: "5a4cb6b4-0870-47bc-b13c-24f96bc4d282") : secret "dns-default-metrics-tls" not found
Apr 21 10:06:02.014066 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:02.014011 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs\") pod \"network-metrics-daemon-kckvj\" (UID: \"e437b5da-7e75-4ed5-8d79-e418168b80fe\") " pod="openshift-multus/network-metrics-daemon-kckvj"
Apr 21 10:06:02.014589 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:02.014152 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 21 10:06:02.014589 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:02.014233 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs podName:e437b5da-7e75-4ed5-8d79-e418168b80fe nodeName:}" failed. No retries permitted until 2026-04-21 10:08:04.014215944 +0000 UTC m=+252.352749943 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs") pod "network-metrics-daemon-kckvj" (UID: "e437b5da-7e75-4ed5-8d79-e418168b80fe") : secret "metrics-daemon-secret" not found
Apr 21 10:06:15.318614 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.318579 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-wbqzj"]
Apr 21 10:06:15.320992 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.320974 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-wbqzj"
Apr 21 10:06:15.323415 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.323392 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 21 10:06:15.323580 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.323392 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 21 10:06:15.324094 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.324072 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-nhc8v\""
Apr 21 10:06:15.328989 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.328966 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-wbqzj"]
Apr 21 10:06:15.414643 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.414604 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s96rf\" (UniqueName: \"kubernetes.io/projected/1b0d20d2-b3e2-4083-9dd1-a3a892bbf98a-kube-api-access-s96rf\") pod \"volume-data-source-validator-7c6cbb6c87-wbqzj\" (UID: \"1b0d20d2-b3e2-4083-9dd1-a3a892bbf98a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-wbqzj"
Apr 21 10:06:15.424941 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.424908 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9"]
Apr 21 10:06:15.426921 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.426905 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9"
Apr 21 10:06:15.429686 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.429660 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 21 10:06:15.429935 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.429778 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-5g6vh\""
Apr 21 10:06:15.430150 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.430128 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 21 10:06:15.430357 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.430143 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 21 10:06:15.430868 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.430851 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 21 10:06:15.438312 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.438284 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9"]
Apr 21 10:06:15.515697 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.515662 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s96rf\" (UniqueName: \"kubernetes.io/projected/1b0d20d2-b3e2-4083-9dd1-a3a892bbf98a-kube-api-access-s96rf\") pod \"volume-data-source-validator-7c6cbb6c87-wbqzj\" (UID: \"1b0d20d2-b3e2-4083-9dd1-a3a892bbf98a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-wbqzj"
Apr 21 10:06:15.515918 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.515722 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42a73500-9d9d-4413-8386-90da11688aaa-serving-cert\") pod \"service-ca-operator-d6fc45fc5-pk2f9\" (UID: \"42a73500-9d9d-4413-8386-90da11688aaa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9"
Apr 21 10:06:15.515918 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.515793 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42a73500-9d9d-4413-8386-90da11688aaa-config\") pod \"service-ca-operator-d6fc45fc5-pk2f9\" (UID: \"42a73500-9d9d-4413-8386-90da11688aaa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9"
Apr 21 10:06:15.515918 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.515824 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwtmm\" (UniqueName: \"kubernetes.io/projected/42a73500-9d9d-4413-8386-90da11688aaa-kube-api-access-mwtmm\") pod \"service-ca-operator-d6fc45fc5-pk2f9\" (UID: \"42a73500-9d9d-4413-8386-90da11688aaa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9"
Apr 21 10:06:15.527544 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.527503 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-9vsls"]
Apr 21 10:06:15.529678 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.529655 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:06:15.532274 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.532247 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 21 10:06:15.532540 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.532511 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 21 10:06:15.532702 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.532541 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 21 10:06:15.532702 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.532570 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-vwwlp\""
Apr 21 10:06:15.533047 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.533033 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 21 10:06:15.538206 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.538183 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 21 10:06:15.539877 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.539854 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s96rf\" (UniqueName: \"kubernetes.io/projected/1b0d20d2-b3e2-4083-9dd1-a3a892bbf98a-kube-api-access-s96rf\") pod \"volume-data-source-validator-7c6cbb6c87-wbqzj\" (UID: \"1b0d20d2-b3e2-4083-9dd1-a3a892bbf98a\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-wbqzj"
Apr 21 10:06:15.540032 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.540016 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-9vsls"]
Apr 21 10:06:15.616688 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.616590 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwtmm\" (UniqueName: \"kubernetes.io/projected/42a73500-9d9d-4413-8386-90da11688aaa-kube-api-access-mwtmm\") pod \"service-ca-operator-d6fc45fc5-pk2f9\" (UID: \"42a73500-9d9d-4413-8386-90da11688aaa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9"
Apr 21 10:06:15.616688 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.616634 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7fc40ec-8080-4847-9b12-78671f916c03-config\") pod \"console-operator-9d4b6777b-9vsls\" (UID: \"c7fc40ec-8080-4847-9b12-78671f916c03\") " pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:06:15.616688 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.616666 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7fc40ec-8080-4847-9b12-78671f916c03-trusted-ca\") pod \"console-operator-9d4b6777b-9vsls\" (UID: \"c7fc40ec-8080-4847-9b12-78671f916c03\") " pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:06:15.616968 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.616813 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7fc40ec-8080-4847-9b12-78671f916c03-serving-cert\") pod \"console-operator-9d4b6777b-9vsls\" (UID: \"c7fc40ec-8080-4847-9b12-78671f916c03\") " pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:06:15.616968 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.616854 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42a73500-9d9d-4413-8386-90da11688aaa-serving-cert\") pod \"service-ca-operator-d6fc45fc5-pk2f9\" (UID: \"42a73500-9d9d-4413-8386-90da11688aaa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9"
Apr 21 10:06:15.616968 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.616890 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqmtt\" (UniqueName: \"kubernetes.io/projected/c7fc40ec-8080-4847-9b12-78671f916c03-kube-api-access-zqmtt\") pod \"console-operator-9d4b6777b-9vsls\" (UID: \"c7fc40ec-8080-4847-9b12-78671f916c03\") " pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:06:15.616968 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.616918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42a73500-9d9d-4413-8386-90da11688aaa-config\") pod \"service-ca-operator-d6fc45fc5-pk2f9\" (UID: \"42a73500-9d9d-4413-8386-90da11688aaa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9"
Apr 21 10:06:15.617382 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.617363 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42a73500-9d9d-4413-8386-90da11688aaa-config\") pod \"service-ca-operator-d6fc45fc5-pk2f9\" (UID: \"42a73500-9d9d-4413-8386-90da11688aaa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9"
Apr 21 10:06:15.619126 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.619104 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42a73500-9d9d-4413-8386-90da11688aaa-serving-cert\") pod \"service-ca-operator-d6fc45fc5-pk2f9\" (UID: \"42a73500-9d9d-4413-8386-90da11688aaa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9"
Apr 21 10:06:15.624651 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.624624 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwtmm\" (UniqueName: \"kubernetes.io/projected/42a73500-9d9d-4413-8386-90da11688aaa-kube-api-access-mwtmm\") pod \"service-ca-operator-d6fc45fc5-pk2f9\" (UID: \"42a73500-9d9d-4413-8386-90da11688aaa\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9"
Apr 21 10:06:15.630495 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.630474 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-wbqzj"
Apr 21 10:06:15.718187 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.718154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zqmtt\" (UniqueName: \"kubernetes.io/projected/c7fc40ec-8080-4847-9b12-78671f916c03-kube-api-access-zqmtt\") pod \"console-operator-9d4b6777b-9vsls\" (UID: \"c7fc40ec-8080-4847-9b12-78671f916c03\") " pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:06:15.718365 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.718217 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7fc40ec-8080-4847-9b12-78671f916c03-config\") pod \"console-operator-9d4b6777b-9vsls\" (UID: \"c7fc40ec-8080-4847-9b12-78671f916c03\") " pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:06:15.718365 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.718253 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7fc40ec-8080-4847-9b12-78671f916c03-trusted-ca\") pod \"console-operator-9d4b6777b-9vsls\" (UID: \"c7fc40ec-8080-4847-9b12-78671f916c03\") " pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:06:15.718365 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.718314 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7fc40ec-8080-4847-9b12-78671f916c03-serving-cert\") pod \"console-operator-9d4b6777b-9vsls\" (UID: \"c7fc40ec-8080-4847-9b12-78671f916c03\") " pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:06:15.719084 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.719055 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7fc40ec-8080-4847-9b12-78671f916c03-config\") pod \"console-operator-9d4b6777b-9vsls\" (UID: \"c7fc40ec-8080-4847-9b12-78671f916c03\") " pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:06:15.719357 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.719338 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c7fc40ec-8080-4847-9b12-78671f916c03-trusted-ca\") pod \"console-operator-9d4b6777b-9vsls\" (UID: \"c7fc40ec-8080-4847-9b12-78671f916c03\") " pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:06:15.720813 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.720791 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c7fc40ec-8080-4847-9b12-78671f916c03-serving-cert\") pod \"console-operator-9d4b6777b-9vsls\" (UID: \"c7fc40ec-8080-4847-9b12-78671f916c03\") " pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:06:15.725737 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.725716 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqmtt\" (UniqueName: \"kubernetes.io/projected/c7fc40ec-8080-4847-9b12-78671f916c03-kube-api-access-zqmtt\") pod \"console-operator-9d4b6777b-9vsls\" (UID: \"c7fc40ec-8080-4847-9b12-78671f916c03\") " pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:06:15.739627 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.739592 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9"
Apr 21 10:06:15.746532 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.746501 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-wbqzj"]
Apr 21 10:06:15.750270 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:06:15.750235 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b0d20d2_b3e2_4083_9dd1_a3a892bbf98a.slice/crio-2cbd404384e0b06b3e6d8f25c2b401c2eca90959fb9b4c72ad0074cfd6f97f6e WatchSource:0}: Error finding container 2cbd404384e0b06b3e6d8f25c2b401c2eca90959fb9b4c72ad0074cfd6f97f6e: Status 404 returned error can't find the container with id 2cbd404384e0b06b3e6d8f25c2b401c2eca90959fb9b4c72ad0074cfd6f97f6e
Apr 21 10:06:15.839583 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.839553 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:06:15.854839 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.854806 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9"]
Apr 21 10:06:15.858649 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:06:15.858623 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42a73500_9d9d_4413_8386_90da11688aaa.slice/crio-a5721d47b6a79a954d9c4f2e7d592f80f6d5300bf9365c5fde8ae7dfda1844a8 WatchSource:0}: Error finding container a5721d47b6a79a954d9c4f2e7d592f80f6d5300bf9365c5fde8ae7dfda1844a8: Status 404 returned error can't find the container with id a5721d47b6a79a954d9c4f2e7d592f80f6d5300bf9365c5fde8ae7dfda1844a8
Apr 21 10:06:15.956705 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:15.956669 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-9vsls"]
Apr 21 10:06:15.960977 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:06:15.960949 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7fc40ec_8080_4847_9b12_78671f916c03.slice/crio-00236e48426a82b991b68b09a8a15c31f21b4a9d8552dcdb97f72780bc4fb472 WatchSource:0}: Error finding container 00236e48426a82b991b68b09a8a15c31f21b4a9d8552dcdb97f72780bc4fb472: Status 404 returned error can't find the container with id 00236e48426a82b991b68b09a8a15c31f21b4a9d8552dcdb97f72780bc4fb472
Apr 21 10:06:16.241004 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.240969 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5c9756474c-jljk4"]
Apr 21 10:06:16.243912 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.243889 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.246501 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.246314 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 21 10:06:16.246630 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.246526 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 21 10:06:16.246630 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.246529 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 21 10:06:16.246630 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.246614 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7p24h\""
Apr 21 10:06:16.251674 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.251644 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 21 10:06:16.257149 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.257123 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c9756474c-jljk4"]
Apr 21 10:06:16.322432 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.322317 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d34def8-62fa-4cc6-a42f-4efb93bf5234-trusted-ca\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.322432 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.322370 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn7xv\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-kube-api-access-kn7xv\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.322984 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.322456 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-bound-sa-token\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.322984 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.322502 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.322984 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.322530 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0d34def8-62fa-4cc6-a42f-4efb93bf5234-installation-pull-secrets\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.322984 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.322600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-certificates\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.322984 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.322705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0d34def8-62fa-4cc6-a42f-4efb93bf5234-image-registry-private-configuration\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.322984 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.322741 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0d34def8-62fa-4cc6-a42f-4efb93bf5234-ca-trust-extracted\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.423519 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.423480 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-bound-sa-token\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.423705 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.423596 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.423705 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.423625 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0d34def8-62fa-4cc6-a42f-4efb93bf5234-installation-pull-secrets\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.423705 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.423660 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-certificates\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.423928 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.423726 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0d34def8-62fa-4cc6-a42f-4efb93bf5234-image-registry-private-configuration\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.423928 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:16.423764 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 10:06:16.423928 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:16.423789 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c9756474c-jljk4: secret "image-registry-tls" not found
Apr 21 10:06:16.423928 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:16.423849 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls podName:0d34def8-62fa-4cc6-a42f-4efb93bf5234 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:16.92382789 +0000 UTC m=+145.262361905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls") pod "image-registry-5c9756474c-jljk4" (UID: "0d34def8-62fa-4cc6-a42f-4efb93bf5234") : secret "image-registry-tls" not found
Apr 21 10:06:16.423928 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.423768 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0d34def8-62fa-4cc6-a42f-4efb93bf5234-ca-trust-extracted\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.423928 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.423919 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d34def8-62fa-4cc6-a42f-4efb93bf5234-trusted-ca\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.424233 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.423958 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kn7xv\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-kube-api-access-kn7xv\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.424288 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.424230 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0d34def8-62fa-4cc6-a42f-4efb93bf5234-ca-trust-extracted\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.424814 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.424789 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-certificates\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.425503 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.425457 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d34def8-62fa-4cc6-a42f-4efb93bf5234-trusted-ca\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.427997 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.427955 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0d34def8-62fa-4cc6-a42f-4efb93bf5234-installation-pull-secrets\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.427997 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.427966 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0d34def8-62fa-4cc6-a42f-4efb93bf5234-image-registry-private-configuration\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.435095 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.435068 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn7xv\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-kube-api-access-kn7xv\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.435274 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.435251 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-bound-sa-token\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.655121 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.655015 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls" event={"ID":"c7fc40ec-8080-4847-9b12-78671f916c03","Type":"ContainerStarted","Data":"00236e48426a82b991b68b09a8a15c31f21b4a9d8552dcdb97f72780bc4fb472"}
Apr 21 10:06:16.656070 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.656039 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9" event={"ID":"42a73500-9d9d-4413-8386-90da11688aaa","Type":"ContainerStarted","Data":"a5721d47b6a79a954d9c4f2e7d592f80f6d5300bf9365c5fde8ae7dfda1844a8"}
Apr 21 10:06:16.657138 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.657074 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-wbqzj" event={"ID":"1b0d20d2-b3e2-4083-9dd1-a3a892bbf98a","Type":"ContainerStarted","Data":"2cbd404384e0b06b3e6d8f25c2b401c2eca90959fb9b4c72ad0074cfd6f97f6e"}
Apr 21 10:06:16.928644 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:16.928500 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:16.928837 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:16.928683 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 10:06:16.928837 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:16.928708 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c9756474c-jljk4: secret "image-registry-tls" not found
Apr 21 10:06:16.928837 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:16.928799 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls podName:0d34def8-62fa-4cc6-a42f-4efb93bf5234 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:17.928776558 +0000 UTC m=+146.267310577 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls") pod "image-registry-5c9756474c-jljk4" (UID: "0d34def8-62fa-4cc6-a42f-4efb93bf5234") : secret "image-registry-tls" not found
Apr 21 10:06:17.660494 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:17.660456 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-wbqzj" event={"ID":"1b0d20d2-b3e2-4083-9dd1-a3a892bbf98a","Type":"ContainerStarted","Data":"b96d9eb8690b66e8d6a76621ed0b466516dda1485ad6718f80b025061d4d2947"}
Apr 21 10:06:17.678974 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:17.678921 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-wbqzj" podStartSLOduration=1.396690702 podStartE2EDuration="2.678905419s" podCreationTimestamp="2026-04-21 10:06:15 +0000 UTC" firstStartedPulling="2026-04-21 10:06:15.752089734 +0000 UTC m=+144.090623736" lastFinishedPulling="2026-04-21 10:06:17.034304455 +0000 UTC m=+145.372838453" observedRunningTime="2026-04-21 10:06:17.678097118 +0000 UTC m=+146.016631136" watchObservedRunningTime="2026-04-21 10:06:17.678905419 +0000 UTC m=+146.017439456"
Apr 21 10:06:17.936480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:17.936385 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:06:17.936662 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:17.936572 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 21 10:06:17.936662 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:17.936598 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c9756474c-jljk4: secret "image-registry-tls" not found
Apr 21 10:06:17.936791 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:17.936670 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls podName:0d34def8-62fa-4cc6-a42f-4efb93bf5234 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:19.936648881 +0000 UTC m=+148.275182892 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls") pod "image-registry-5c9756474c-jljk4" (UID: "0d34def8-62fa-4cc6-a42f-4efb93bf5234") : secret "image-registry-tls" not found
Apr 21 10:06:18.664565 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:18.664477 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/0.log"
Apr 21 10:06:18.664565 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:18.664526 2577 generic.go:358] "Generic (PLEG): container finished" podID="c7fc40ec-8080-4847-9b12-78671f916c03" containerID="6f055b1fd6db17f5b7c48f5972bb6d9dc2ff2371bf80275cbe66dd1ec9e8e881" exitCode=255
Apr 21 10:06:18.665105 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:18.664589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls" event={"ID":"c7fc40ec-8080-4847-9b12-78671f916c03","Type":"ContainerDied","Data":"6f055b1fd6db17f5b7c48f5972bb6d9dc2ff2371bf80275cbe66dd1ec9e8e881"}
Apr 21 10:06:18.665105 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:18.664895 2577 scope.go:117] "RemoveContainer"
containerID="6f055b1fd6db17f5b7c48f5972bb6d9dc2ff2371bf80275cbe66dd1ec9e8e881" Apr 21 10:06:18.666342 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:18.666318 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9" event={"ID":"42a73500-9d9d-4413-8386-90da11688aaa","Type":"ContainerStarted","Data":"a5fd0ec4fa5adfff2013c81ea12945d6a1e7d7a9a3e10da6aba1f6fa6425cb8b"} Apr 21 10:06:19.669436 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:19.669381 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/1.log" Apr 21 10:06:19.669876 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:19.669791 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/0.log" Apr 21 10:06:19.669876 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:19.669833 2577 generic.go:358] "Generic (PLEG): container finished" podID="c7fc40ec-8080-4847-9b12-78671f916c03" containerID="8fe0675785ca8ac3f48d3d86f2cdf80bec8d00c408c832b3843ff43f58ddb045" exitCode=255 Apr 21 10:06:19.669949 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:19.669927 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls" event={"ID":"c7fc40ec-8080-4847-9b12-78671f916c03","Type":"ContainerDied","Data":"8fe0675785ca8ac3f48d3d86f2cdf80bec8d00c408c832b3843ff43f58ddb045"} Apr 21 10:06:19.669986 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:19.669968 2577 scope.go:117] "RemoveContainer" containerID="6f055b1fd6db17f5b7c48f5972bb6d9dc2ff2371bf80275cbe66dd1ec9e8e881" Apr 21 10:06:19.670202 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:19.670183 2577 scope.go:117] "RemoveContainer" containerID="8fe0675785ca8ac3f48d3d86f2cdf80bec8d00c408c832b3843ff43f58ddb045" 
Apr 21 10:06:19.670379 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:19.670364 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-9vsls_openshift-console-operator(c7fc40ec-8080-4847-9b12-78671f916c03)\"" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls" podUID="c7fc40ec-8080-4847-9b12-78671f916c03" Apr 21 10:06:19.692235 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:19.692187 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9" podStartSLOduration=2.452282102 podStartE2EDuration="4.69217226s" podCreationTimestamp="2026-04-21 10:06:15 +0000 UTC" firstStartedPulling="2026-04-21 10:06:15.860601419 +0000 UTC m=+144.199135415" lastFinishedPulling="2026-04-21 10:06:18.100491578 +0000 UTC m=+146.439025573" observedRunningTime="2026-04-21 10:06:18.699347823 +0000 UTC m=+147.037881834" watchObservedRunningTime="2026-04-21 10:06:19.69217226 +0000 UTC m=+148.030706277" Apr 21 10:06:19.956987 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:19.956899 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4" Apr 21 10:06:19.957133 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:19.957023 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 10:06:19.957133 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:19.957036 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-5c9756474c-jljk4: secret "image-registry-tls" not found Apr 21 10:06:19.957133 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:19.957086 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls podName:0d34def8-62fa-4cc6-a42f-4efb93bf5234 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:23.957072899 +0000 UTC m=+152.295606899 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls") pod "image-registry-5c9756474c-jljk4" (UID: "0d34def8-62fa-4cc6-a42f-4efb93bf5234") : secret "image-registry-tls" not found Apr 21 10:06:20.676520 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:20.673963 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/1.log" Apr 21 10:06:20.677215 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:20.677192 2577 scope.go:117] "RemoveContainer" containerID="8fe0675785ca8ac3f48d3d86f2cdf80bec8d00c408c832b3843ff43f58ddb045" Apr 21 10:06:20.677393 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:20.677374 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-9vsls_openshift-console-operator(c7fc40ec-8080-4847-9b12-78671f916c03)\"" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls" podUID="c7fc40ec-8080-4847-9b12-78671f916c03" Apr 21 10:06:20.902718 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:20.902688 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-c588q_186c1594-0ba9-495b-8213-27692e681b57/dns-node-resolver/0.log" Apr 21 10:06:21.489176 
ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:21.489137 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-7829w"] Apr 21 10:06:21.493367 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:21.493348 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7829w" Apr 21 10:06:21.495717 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:21.495684 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 21 10:06:21.495885 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:21.495722 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 21 10:06:21.496439 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:21.496421 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-ngxc8\"" Apr 21 10:06:21.502278 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:21.502253 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-7829w"] Apr 21 10:06:21.504846 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:21.504825 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4ggkj_b4e66db4-f8f5-415c-aac2-60c02dfc43ff/node-ca/0.log" Apr 21 10:06:21.570250 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:21.570212 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djm9v\" (UniqueName: \"kubernetes.io/projected/ad55884a-4753-4348-a129-272c6dfc8db3-kube-api-access-djm9v\") pod \"migrator-74bb7799d9-7829w\" (UID: \"ad55884a-4753-4348-a129-272c6dfc8db3\") " 
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7829w" Apr 21 10:06:21.670676 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:21.670643 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djm9v\" (UniqueName: \"kubernetes.io/projected/ad55884a-4753-4348-a129-272c6dfc8db3-kube-api-access-djm9v\") pod \"migrator-74bb7799d9-7829w\" (UID: \"ad55884a-4753-4348-a129-272c6dfc8db3\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7829w" Apr 21 10:06:21.679605 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:21.679574 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djm9v\" (UniqueName: \"kubernetes.io/projected/ad55884a-4753-4348-a129-272c6dfc8db3-kube-api-access-djm9v\") pod \"migrator-74bb7799d9-7829w\" (UID: \"ad55884a-4753-4348-a129-272c6dfc8db3\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7829w" Apr 21 10:06:21.803068 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:21.802966 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7829w" Apr 21 10:06:21.923994 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:21.923958 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-7829w"] Apr 21 10:06:21.927098 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:06:21.927068 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad55884a_4753_4348_a129_272c6dfc8db3.slice/crio-80772233805a98936a4ce0d6f0a5a001ac5a9f97c497319c748c58ccd42e55a6 WatchSource:0}: Error finding container 80772233805a98936a4ce0d6f0a5a001ac5a9f97c497319c748c58ccd42e55a6: Status 404 returned error can't find the container with id 80772233805a98936a4ce0d6f0a5a001ac5a9f97c497319c748c58ccd42e55a6 Apr 21 10:06:22.679583 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:22.679529 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7829w" event={"ID":"ad55884a-4753-4348-a129-272c6dfc8db3","Type":"ContainerStarted","Data":"80772233805a98936a4ce0d6f0a5a001ac5a9f97c497319c748c58ccd42e55a6"} Apr 21 10:06:23.683682 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:23.683647 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7829w" event={"ID":"ad55884a-4753-4348-a129-272c6dfc8db3","Type":"ContainerStarted","Data":"e785c9151d0c7cc725a3c69c91317cb83407063580ddf6890be7ed48ac7efd64"} Apr 21 10:06:23.683682 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:23.683684 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7829w" event={"ID":"ad55884a-4753-4348-a129-272c6dfc8db3","Type":"ContainerStarted","Data":"622b6db5e2010c1d0183d96926deed5fec1a5b8ffbacf7a8421ed00d74d5ddab"} Apr 21 10:06:23.702739 ip-10-0-133-157 
kubenswrapper[2577]: I0421 10:06:23.702678 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-7829w" podStartSLOduration=1.71635624 podStartE2EDuration="2.702662506s" podCreationTimestamp="2026-04-21 10:06:21 +0000 UTC" firstStartedPulling="2026-04-21 10:06:21.929135781 +0000 UTC m=+150.267669776" lastFinishedPulling="2026-04-21 10:06:22.915442047 +0000 UTC m=+151.253976042" observedRunningTime="2026-04-21 10:06:23.700797089 +0000 UTC m=+152.039331105" watchObservedRunningTime="2026-04-21 10:06:23.702662506 +0000 UTC m=+152.041196550" Apr 21 10:06:23.988435 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:23.988398 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4" Apr 21 10:06:23.988632 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:23.988606 2577 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 21 10:06:23.988707 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:23.988636 2577 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5c9756474c-jljk4: secret "image-registry-tls" not found Apr 21 10:06:23.988790 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:23.988730 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls podName:0d34def8-62fa-4cc6-a42f-4efb93bf5234 nodeName:}" failed. No retries permitted until 2026-04-21 10:06:31.988706878 +0000 UTC m=+160.327240893 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls") pod "image-registry-5c9756474c-jljk4" (UID: "0d34def8-62fa-4cc6-a42f-4efb93bf5234") : secret "image-registry-tls" not found Apr 21 10:06:25.840428 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:25.840389 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls" Apr 21 10:06:25.840428 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:25.840434 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls" Apr 21 10:06:25.840875 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:25.840828 2577 scope.go:117] "RemoveContainer" containerID="8fe0675785ca8ac3f48d3d86f2cdf80bec8d00c408c832b3843ff43f58ddb045" Apr 21 10:06:25.841016 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:25.840996 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-9vsls_openshift-console-operator(c7fc40ec-8080-4847-9b12-78671f916c03)\"" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls" podUID="c7fc40ec-8080-4847-9b12-78671f916c03" Apr 21 10:06:28.115688 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:28.115637 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-6lwhm" podUID="b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0" Apr 21 10:06:28.133113 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:28.133065 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: 
context deadline exceeded" pod="openshift-dns/dns-default-75v2d" podUID="5a4cb6b4-0870-47bc-b13c-24f96bc4d282" Apr 21 10:06:28.247271 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:28.247225 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-kckvj" podUID="e437b5da-7e75-4ed5-8d79-e418168b80fe" Apr 21 10:06:28.695737 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:28.695704 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-75v2d" Apr 21 10:06:28.695923 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:28.695742 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:06:32.054708 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:32.054650 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4" Apr 21 10:06:32.057097 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:32.057073 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls\") pod \"image-registry-5c9756474c-jljk4\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " pod="openshift-image-registry/image-registry-5c9756474c-jljk4" Apr 21 10:06:32.156602 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:32.156549 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c9756474c-jljk4" Apr 21 10:06:32.282711 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:32.282678 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5c9756474c-jljk4"] Apr 21 10:06:32.285923 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:06:32.285894 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d34def8_62fa_4cc6_a42f_4efb93bf5234.slice/crio-686f2c96cee31161763d4dc213d1c92b51c6daa4b8a60422e5fe33d93b55a16d WatchSource:0}: Error finding container 686f2c96cee31161763d4dc213d1c92b51c6daa4b8a60422e5fe33d93b55a16d: Status 404 returned error can't find the container with id 686f2c96cee31161763d4dc213d1c92b51c6daa4b8a60422e5fe33d93b55a16d Apr 21 10:06:32.705857 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:32.705736 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c9756474c-jljk4" event={"ID":"0d34def8-62fa-4cc6-a42f-4efb93bf5234","Type":"ContainerStarted","Data":"5aac823bcdf65d084c3fea4e4014d5d990a0bc1d03e004682c0c993bb9b6a2e2"} Apr 21 10:06:32.705857 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:32.705791 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c9756474c-jljk4" event={"ID":"0d34def8-62fa-4cc6-a42f-4efb93bf5234","Type":"ContainerStarted","Data":"686f2c96cee31161763d4dc213d1c92b51c6daa4b8a60422e5fe33d93b55a16d"} Apr 21 10:06:32.706077 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:32.705879 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5c9756474c-jljk4" Apr 21 10:06:32.727116 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:32.727065 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5c9756474c-jljk4" 
podStartSLOduration=16.727048329 podStartE2EDuration="16.727048329s" podCreationTimestamp="2026-04-21 10:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:06:32.726283102 +0000 UTC m=+161.064817120" watchObservedRunningTime="2026-04-21 10:06:32.727048329 +0000 UTC m=+161.065582376" Apr 21 10:06:33.063642 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:33.063601 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert\") pod \"ingress-canary-6lwhm\" (UID: \"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0\") " pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:06:33.064062 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:33.063682 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:06:33.066103 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:33.066074 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a4cb6b4-0870-47bc-b13c-24f96bc4d282-metrics-tls\") pod \"dns-default-75v2d\" (UID: \"5a4cb6b4-0870-47bc-b13c-24f96bc4d282\") " pod="openshift-dns/dns-default-75v2d" Apr 21 10:06:33.066284 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:33.066263 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0-cert\") pod \"ingress-canary-6lwhm\" (UID: \"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0\") " pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:06:33.198675 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:33.198637 2577 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-2fp54\"" Apr 21 10:06:33.198890 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:33.198720 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-nqzpr\"" Apr 21 10:06:33.206792 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:33.206740 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-6lwhm" Apr 21 10:06:33.206862 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:33.206786 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-75v2d" Apr 21 10:06:33.335049 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:33.334978 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-75v2d"] Apr 21 10:06:33.338730 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:06:33.338691 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a4cb6b4_0870_47bc_b13c_24f96bc4d282.slice/crio-b95b44608e325c25c3ab002509a7c22665e5ba4d7eac43b3245ab50b1302ab8e WatchSource:0}: Error finding container b95b44608e325c25c3ab002509a7c22665e5ba4d7eac43b3245ab50b1302ab8e: Status 404 returned error can't find the container with id b95b44608e325c25c3ab002509a7c22665e5ba4d7eac43b3245ab50b1302ab8e Apr 21 10:06:33.347817 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:33.347788 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-6lwhm"] Apr 21 10:06:33.351228 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:06:33.351200 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4056ea2_3fd8_4fb5_8ed3_575f7ee5cda0.slice/crio-b6e0b03f6334943de08a0de088a1179d8b8ec0e478aa53cfbf7f76e8cf173c9a 
WatchSource:0}: Error finding container b6e0b03f6334943de08a0de088a1179d8b8ec0e478aa53cfbf7f76e8cf173c9a: Status 404 returned error can't find the container with id b6e0b03f6334943de08a0de088a1179d8b8ec0e478aa53cfbf7f76e8cf173c9a Apr 21 10:06:33.709575 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:33.709476 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6lwhm" event={"ID":"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0","Type":"ContainerStarted","Data":"b6e0b03f6334943de08a0de088a1179d8b8ec0e478aa53cfbf7f76e8cf173c9a"} Apr 21 10:06:33.710352 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:33.710328 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-75v2d" event={"ID":"5a4cb6b4-0870-47bc-b13c-24f96bc4d282","Type":"ContainerStarted","Data":"b95b44608e325c25c3ab002509a7c22665e5ba4d7eac43b3245ab50b1302ab8e"} Apr 21 10:06:35.716519 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:35.716483 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-75v2d" event={"ID":"5a4cb6b4-0870-47bc-b13c-24f96bc4d282","Type":"ContainerStarted","Data":"1e24b32c52958bba3ba809ba5725e8759539c3225753519e14256165097df4e3"} Apr 21 10:06:35.716519 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:35.716521 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-75v2d" event={"ID":"5a4cb6b4-0870-47bc-b13c-24f96bc4d282","Type":"ContainerStarted","Data":"b488dc228bd6be8e0b6b9384c8518e29c0036490470d7404c97e757cfb01926d"} Apr 21 10:06:35.717012 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:35.716616 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-75v2d" Apr 21 10:06:35.717851 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:35.717817 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-6lwhm" 
event={"ID":"b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0","Type":"ContainerStarted","Data":"217bdbec90a9599d91ebe739c0cbac420d81facb354c534734790be529430fbf"} Apr 21 10:06:35.733030 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:35.732982 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-75v2d" podStartSLOduration=128.888335247 podStartE2EDuration="2m10.732965607s" podCreationTimestamp="2026-04-21 10:04:25 +0000 UTC" firstStartedPulling="2026-04-21 10:06:33.340941913 +0000 UTC m=+161.679475921" lastFinishedPulling="2026-04-21 10:06:35.185572285 +0000 UTC m=+163.524106281" observedRunningTime="2026-04-21 10:06:35.731474859 +0000 UTC m=+164.070008876" watchObservedRunningTime="2026-04-21 10:06:35.732965607 +0000 UTC m=+164.071499625" Apr 21 10:06:35.745770 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:35.745698 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-6lwhm" podStartSLOduration=128.868589171 podStartE2EDuration="2m10.745678699s" podCreationTimestamp="2026-04-21 10:04:25 +0000 UTC" firstStartedPulling="2026-04-21 10:06:33.35285411 +0000 UTC m=+161.691388105" lastFinishedPulling="2026-04-21 10:06:35.229943625 +0000 UTC m=+163.568477633" observedRunningTime="2026-04-21 10:06:35.745462458 +0000 UTC m=+164.083996472" watchObservedRunningTime="2026-04-21 10:06:35.745678699 +0000 UTC m=+164.084212712" Apr 21 10:06:38.228337 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:38.228305 2577 scope.go:117] "RemoveContainer" containerID="8fe0675785ca8ac3f48d3d86f2cdf80bec8d00c408c832b3843ff43f58ddb045" Apr 21 10:06:38.726589 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:38.726555 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 10:06:38.727001 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:38.726986 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/1.log" Apr 21 10:06:38.727098 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:38.727018 2577 generic.go:358] "Generic (PLEG): container finished" podID="c7fc40ec-8080-4847-9b12-78671f916c03" containerID="e55ef6afed1c174a4fbbc2a8c1e3120b97ccef8aa477568ce01343bded667032" exitCode=255 Apr 21 10:06:38.727098 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:38.727063 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls" event={"ID":"c7fc40ec-8080-4847-9b12-78671f916c03","Type":"ContainerDied","Data":"e55ef6afed1c174a4fbbc2a8c1e3120b97ccef8aa477568ce01343bded667032"} Apr 21 10:06:38.727098 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:38.727090 2577 scope.go:117] "RemoveContainer" containerID="8fe0675785ca8ac3f48d3d86f2cdf80bec8d00c408c832b3843ff43f58ddb045" Apr 21 10:06:38.727401 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:38.727383 2577 scope.go:117] "RemoveContainer" containerID="e55ef6afed1c174a4fbbc2a8c1e3120b97ccef8aa477568ce01343bded667032" Apr 21 10:06:38.727577 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:38.727556 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-9vsls_openshift-console-operator(c7fc40ec-8080-4847-9b12-78671f916c03)\"" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls" podUID="c7fc40ec-8080-4847-9b12-78671f916c03" Apr 21 10:06:39.730477 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:39.730445 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 10:06:40.231126 
ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:40.231102 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:06:44.400969 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.400884 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-lttth"] Apr 21 10:06:44.406604 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.406582 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.409085 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.409060 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 21 10:06:44.409188 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.409059 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 21 10:06:44.410021 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.409999 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 21 10:06:44.410102 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.410029 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fdjpw\"" Apr 21 10:06:44.410102 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.410002 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 21 10:06:44.419589 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.419561 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lttth"] Apr 21 10:06:44.452198 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.452164 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/93143241-5e63-4495-914e-e2c61040261e-crio-socket\") pod \"insights-runtime-extractor-lttth\" (UID: \"93143241-5e63-4495-914e-e2c61040261e\") " pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.452387 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.452251 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/93143241-5e63-4495-914e-e2c61040261e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lttth\" (UID: \"93143241-5e63-4495-914e-e2c61040261e\") " pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.452387 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.452278 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/93143241-5e63-4495-914e-e2c61040261e-data-volume\") pod \"insights-runtime-extractor-lttth\" (UID: \"93143241-5e63-4495-914e-e2c61040261e\") " pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.452387 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.452319 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/93143241-5e63-4495-914e-e2c61040261e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lttth\" (UID: \"93143241-5e63-4495-914e-e2c61040261e\") " pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.452387 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.452346 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6q5d\" (UniqueName: \"kubernetes.io/projected/93143241-5e63-4495-914e-e2c61040261e-kube-api-access-v6q5d\") 
pod \"insights-runtime-extractor-lttth\" (UID: \"93143241-5e63-4495-914e-e2c61040261e\") " pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.476238 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.476200 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c9756474c-jljk4"] Apr 21 10:06:44.553170 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.553127 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/93143241-5e63-4495-914e-e2c61040261e-crio-socket\") pod \"insights-runtime-extractor-lttth\" (UID: \"93143241-5e63-4495-914e-e2c61040261e\") " pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.553343 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.553194 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/93143241-5e63-4495-914e-e2c61040261e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lttth\" (UID: \"93143241-5e63-4495-914e-e2c61040261e\") " pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.553343 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.553265 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/93143241-5e63-4495-914e-e2c61040261e-crio-socket\") pod \"insights-runtime-extractor-lttth\" (UID: \"93143241-5e63-4495-914e-e2c61040261e\") " pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.553343 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.553312 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/93143241-5e63-4495-914e-e2c61040261e-data-volume\") pod \"insights-runtime-extractor-lttth\" (UID: \"93143241-5e63-4495-914e-e2c61040261e\") " 
pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.553489 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.553351 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/93143241-5e63-4495-914e-e2c61040261e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lttth\" (UID: \"93143241-5e63-4495-914e-e2c61040261e\") " pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.553489 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.553371 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6q5d\" (UniqueName: \"kubernetes.io/projected/93143241-5e63-4495-914e-e2c61040261e-kube-api-access-v6q5d\") pod \"insights-runtime-extractor-lttth\" (UID: \"93143241-5e63-4495-914e-e2c61040261e\") " pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.553726 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.553706 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/93143241-5e63-4495-914e-e2c61040261e-data-volume\") pod \"insights-runtime-extractor-lttth\" (UID: \"93143241-5e63-4495-914e-e2c61040261e\") " pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.553841 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.553732 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/93143241-5e63-4495-914e-e2c61040261e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-lttth\" (UID: \"93143241-5e63-4495-914e-e2c61040261e\") " pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.555782 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.555766 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/93143241-5e63-4495-914e-e2c61040261e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-lttth\" (UID: \"93143241-5e63-4495-914e-e2c61040261e\") " pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.564586 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.564544 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6q5d\" (UniqueName: \"kubernetes.io/projected/93143241-5e63-4495-914e-e2c61040261e-kube-api-access-v6q5d\") pod \"insights-runtime-extractor-lttth\" (UID: \"93143241-5e63-4495-914e-e2c61040261e\") " pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.715803 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.715777 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-lttth" Apr 21 10:06:44.839922 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:44.839884 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-lttth"] Apr 21 10:06:44.843808 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:06:44.843774 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93143241_5e63_4495_914e_e2c61040261e.slice/crio-0d4ba5eb4a4f70022f69b7500282edf1828c65b1f2f3a5110cfa55680c1bdae9 WatchSource:0}: Error finding container 0d4ba5eb4a4f70022f69b7500282edf1828c65b1f2f3a5110cfa55680c1bdae9: Status 404 returned error can't find the container with id 0d4ba5eb4a4f70022f69b7500282edf1828c65b1f2f3a5110cfa55680c1bdae9 Apr 21 10:06:45.722183 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:45.722147 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-75v2d" Apr 21 10:06:45.750608 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:45.750509 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-lttth" event={"ID":"93143241-5e63-4495-914e-e2c61040261e","Type":"ContainerStarted","Data":"0d0083aec411b9e4879c60820c16ad50e6913f70f6abd5605a48c6a4af152893"} Apr 21 10:06:45.750608 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:45.750546 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lttth" event={"ID":"93143241-5e63-4495-914e-e2c61040261e","Type":"ContainerStarted","Data":"e9fe57fdac401b516c9b38835ff9813edca67f7428b7a55c53223ba3cbf8ff21"} Apr 21 10:06:45.750608 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:45.750555 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lttth" event={"ID":"93143241-5e63-4495-914e-e2c61040261e","Type":"ContainerStarted","Data":"0d4ba5eb4a4f70022f69b7500282edf1828c65b1f2f3a5110cfa55680c1bdae9"} Apr 21 10:06:45.839973 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:45.839936 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls" Apr 21 10:06:45.840116 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:45.839992 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls" Apr 21 10:06:45.840396 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:45.840379 2577 scope.go:117] "RemoveContainer" containerID="e55ef6afed1c174a4fbbc2a8c1e3120b97ccef8aa477568ce01343bded667032" Apr 21 10:06:45.840596 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:06:45.840577 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-9vsls_openshift-console-operator(c7fc40ec-8080-4847-9b12-78671f916c03)\"" 
pod="openshift-console-operator/console-operator-9d4b6777b-9vsls" podUID="c7fc40ec-8080-4847-9b12-78671f916c03" Apr 21 10:06:47.759444 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:47.759409 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-lttth" event={"ID":"93143241-5e63-4495-914e-e2c61040261e","Type":"ContainerStarted","Data":"02060ebea74a7ebd075cfddfc918c29874d18206f1aeb2b574358e77bf8ac4ad"} Apr 21 10:06:47.814810 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:47.814739 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-lttth" podStartSLOduration=1.760581949 podStartE2EDuration="3.814722672s" podCreationTimestamp="2026-04-21 10:06:44 +0000 UTC" firstStartedPulling="2026-04-21 10:06:44.899803238 +0000 UTC m=+173.238337247" lastFinishedPulling="2026-04-21 10:06:46.95394397 +0000 UTC m=+175.292477970" observedRunningTime="2026-04-21 10:06:47.813612852 +0000 UTC m=+176.152146868" watchObservedRunningTime="2026-04-21 10:06:47.814722672 +0000 UTC m=+176.153256689" Apr 21 10:06:54.482120 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:54.482091 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5c9756474c-jljk4" Apr 21 10:06:57.017054 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.017020 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-wfs5l"] Apr 21 10:06:57.021638 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.021608 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.025675 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.025649 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 21 10:06:57.026498 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.026444 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 21 10:06:57.026963 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.026948 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 21 10:06:57.027492 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.027381 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 21 10:06:57.027978 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.027961 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 21 10:06:57.028879 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.028295 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 21 10:06:57.029432 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.029418 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-5rh4g\"" Apr 21 10:06:57.046457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.046422 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1957a663-aa91-445e-af62-0b93aec5c600-node-exporter-accelerators-collector-config\") pod \"node-exporter-wfs5l\" (UID: 
\"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.046650 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.046470 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1957a663-aa91-445e-af62-0b93aec5c600-root\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.046650 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.046570 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1957a663-aa91-445e-af62-0b93aec5c600-node-exporter-tls\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.046650 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.046605 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1957a663-aa91-445e-af62-0b93aec5c600-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.046650 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.046624 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1957a663-aa91-445e-af62-0b93aec5c600-sys\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.046869 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.046670 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1957a663-aa91-445e-af62-0b93aec5c600-metrics-client-ca\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.046869 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.046697 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd8s7\" (UniqueName: \"kubernetes.io/projected/1957a663-aa91-445e-af62-0b93aec5c600-kube-api-access-zd8s7\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.046869 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.046728 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1957a663-aa91-445e-af62-0b93aec5c600-node-exporter-textfile\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.046869 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.046772 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1957a663-aa91-445e-af62-0b93aec5c600-node-exporter-wtmp\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.148041 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.147996 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1957a663-aa91-445e-af62-0b93aec5c600-node-exporter-textfile\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.148041 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:06:57.148048 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1957a663-aa91-445e-af62-0b93aec5c600-node-exporter-wtmp\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.148297 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.148083 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1957a663-aa91-445e-af62-0b93aec5c600-node-exporter-accelerators-collector-config\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.148297 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.148129 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1957a663-aa91-445e-af62-0b93aec5c600-root\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.148297 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.148182 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1957a663-aa91-445e-af62-0b93aec5c600-node-exporter-tls\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.148297 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.148205 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1957a663-aa91-445e-af62-0b93aec5c600-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " 
pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.148297 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.148240 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/1957a663-aa91-445e-af62-0b93aec5c600-node-exporter-wtmp\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.148297 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.148272 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1957a663-aa91-445e-af62-0b93aec5c600-sys\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.148592 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.148310 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1957a663-aa91-445e-af62-0b93aec5c600-metrics-client-ca\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.148592 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.148340 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zd8s7\" (UniqueName: \"kubernetes.io/projected/1957a663-aa91-445e-af62-0b93aec5c600-kube-api-access-zd8s7\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.148592 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.148377 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1957a663-aa91-445e-af62-0b93aec5c600-sys\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " 
pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.148592 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.148248 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/1957a663-aa91-445e-af62-0b93aec5c600-root\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.148592 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.148412 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/1957a663-aa91-445e-af62-0b93aec5c600-node-exporter-textfile\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.148813 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.148669 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/1957a663-aa91-445e-af62-0b93aec5c600-node-exporter-accelerators-collector-config\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.149582 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.149557 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1957a663-aa91-445e-af62-0b93aec5c600-metrics-client-ca\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.151149 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.151127 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1957a663-aa91-445e-af62-0b93aec5c600-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.151266 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.151203 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/1957a663-aa91-445e-af62-0b93aec5c600-node-exporter-tls\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.157298 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.157265 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd8s7\" (UniqueName: \"kubernetes.io/projected/1957a663-aa91-445e-af62-0b93aec5c600-kube-api-access-zd8s7\") pod \"node-exporter-wfs5l\" (UID: \"1957a663-aa91-445e-af62-0b93aec5c600\") " pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.341140 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.341047 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-wfs5l" Apr 21 10:06:57.349191 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:06:57.349159 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1957a663_aa91_445e_af62_0b93aec5c600.slice/crio-e864e38f9bc3da5b69b57514e4f498a13db6682c0b87203259006792d240fa07 WatchSource:0}: Error finding container e864e38f9bc3da5b69b57514e4f498a13db6682c0b87203259006792d240fa07: Status 404 returned error can't find the container with id e864e38f9bc3da5b69b57514e4f498a13db6682c0b87203259006792d240fa07 Apr 21 10:06:57.789337 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:57.789275 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wfs5l" event={"ID":"1957a663-aa91-445e-af62-0b93aec5c600","Type":"ContainerStarted","Data":"e864e38f9bc3da5b69b57514e4f498a13db6682c0b87203259006792d240fa07"} Apr 21 10:06:58.792826 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:58.792792 2577 generic.go:358] "Generic (PLEG): container finished" podID="1957a663-aa91-445e-af62-0b93aec5c600" containerID="9d2fbeb0edbfdbdb7a2b8e0322047daababc99028b008288ccaaa1a0231091c9" exitCode=0 Apr 21 10:06:58.793173 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:58.792832 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wfs5l" event={"ID":"1957a663-aa91-445e-af62-0b93aec5c600","Type":"ContainerDied","Data":"9d2fbeb0edbfdbdb7a2b8e0322047daababc99028b008288ccaaa1a0231091c9"} Apr 21 10:06:59.797770 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:59.797720 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wfs5l" event={"ID":"1957a663-aa91-445e-af62-0b93aec5c600","Type":"ContainerStarted","Data":"5eca7be79c91692d17a997d5b929315b53ac9c5c445ee974fa16c50cfb20a33d"} Apr 21 10:06:59.798158 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:59.797780 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-wfs5l" event={"ID":"1957a663-aa91-445e-af62-0b93aec5c600","Type":"ContainerStarted","Data":"6e3340c2103edae9111c3dfe5f1f6e5cc5116f3260dfbba24dd963e8735f7238"} Apr 21 10:06:59.817664 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:06:59.817602 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-wfs5l" podStartSLOduration=3.170338511 podStartE2EDuration="3.817586712s" podCreationTimestamp="2026-04-21 10:06:56 +0000 UTC" firstStartedPulling="2026-04-21 10:06:57.351071966 +0000 UTC m=+185.689605962" lastFinishedPulling="2026-04-21 10:06:57.998320165 +0000 UTC m=+186.336854163" observedRunningTime="2026-04-21 10:06:59.817571879 +0000 UTC m=+188.156105896" watchObservedRunningTime="2026-04-21 10:06:59.817586712 +0000 UTC m=+188.156120728" Apr 21 10:07:00.171847 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.171770 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"] Apr 21 10:07:00.175300 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.175275 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.179496 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.179471 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 21 10:07:00.179628 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.179471 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 21 10:07:00.179921 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.179896 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-7a0r4l6snb25b\""
Apr 21 10:07:00.180001 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.179901 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 21 10:07:00.180001 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.179940 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 21 10:07:00.180274 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.180248 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-5rtzs\""
Apr 21 10:07:00.180274 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.180265 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 21 10:07:00.190487 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.190460 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"]
Apr 21 10:07:00.228444 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.228418 2577 scope.go:117] "RemoveContainer" containerID="e55ef6afed1c174a4fbbc2a8c1e3120b97ccef8aa477568ce01343bded667032"
Apr 21 10:07:00.273780 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.273733 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/afe4eaaa-27cd-4437-9086-34a188a2d172-metrics-client-ca\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.273967 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.273797 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-thanos-querier-tls\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.273967 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.273830 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.273967 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.273930 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-grpc-tls\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.274139 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.274017 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.274139 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.274086 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgnns\" (UniqueName: \"kubernetes.io/projected/afe4eaaa-27cd-4437-9086-34a188a2d172-kube-api-access-kgnns\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.274350 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.274157 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.274350 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.274227 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.374953 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.374914 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-grpc-tls\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.375153 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.374975 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.375153 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.375040 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgnns\" (UniqueName: \"kubernetes.io/projected/afe4eaaa-27cd-4437-9086-34a188a2d172-kube-api-access-kgnns\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.375153 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.375072 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.375153 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.375103 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.375153 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.375132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/afe4eaaa-27cd-4437-9086-34a188a2d172-metrics-client-ca\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.375447 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.375262 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-thanos-querier-tls\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.375447 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.375298 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.376656 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.376628 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/afe4eaaa-27cd-4437-9086-34a188a2d172-metrics-client-ca\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.378137 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.378112 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-grpc-tls\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.378286 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.378260 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.378377 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.378354 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.378447 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.378397 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.378547 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.378529 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.378601 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.378532 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/afe4eaaa-27cd-4437-9086-34a188a2d172-secret-thanos-querier-tls\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.384889 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.384859 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgnns\" (UniqueName: \"kubernetes.io/projected/afe4eaaa-27cd-4437-9086-34a188a2d172-kube-api-access-kgnns\") pod \"thanos-querier-5d57f5d8f8-lvwjb\" (UID: \"afe4eaaa-27cd-4437-9086-34a188a2d172\") " pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.484602 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.484568 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:00.630140 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.630101 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"]
Apr 21 10:07:00.633987 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:07:00.633957 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafe4eaaa_27cd_4437_9086_34a188a2d172.slice/crio-974440af5d8322c40828bb6f6fdd9e95a566248a30427319a95cd0c222090cff WatchSource:0}: Error finding container 974440af5d8322c40828bb6f6fdd9e95a566248a30427319a95cd0c222090cff: Status 404 returned error can't find the container with id 974440af5d8322c40828bb6f6fdd9e95a566248a30427319a95cd0c222090cff
Apr 21 10:07:00.801823 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.801717 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb" event={"ID":"afe4eaaa-27cd-4437-9086-34a188a2d172","Type":"ContainerStarted","Data":"974440af5d8322c40828bb6f6fdd9e95a566248a30427319a95cd0c222090cff"}
Apr 21 10:07:00.803272 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.803253 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log"
Apr 21 10:07:00.803402 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.803371 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls" event={"ID":"c7fc40ec-8080-4847-9b12-78671f916c03","Type":"ContainerStarted","Data":"97583f8304802016909f0e701d23d8b4cf5ee260d6f3084f53d685fd27d69981"}
Apr 21 10:07:00.803891 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.803870 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:07:00.825638 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:00.825581 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls" podStartSLOduration=43.686638207 podStartE2EDuration="45.825563994s" podCreationTimestamp="2026-04-21 10:06:15 +0000 UTC" firstStartedPulling="2026-04-21 10:06:15.962726069 +0000 UTC m=+144.301260073" lastFinishedPulling="2026-04-21 10:06:18.101651837 +0000 UTC m=+146.440185860" observedRunningTime="2026-04-21 10:07:00.824607735 +0000 UTC m=+189.163141753" watchObservedRunningTime="2026-04-21 10:07:00.825563994 +0000 UTC m=+189.164098039"
Apr 21 10:07:01.163203 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.163112 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-9vsls"
Apr 21 10:07:01.360181 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.360133 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-4wrmx"]
Apr 21 10:07:01.363342 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.363312 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-4wrmx"
Apr 21 10:07:01.366034 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.366008 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-wfmgz\""
Apr 21 10:07:01.366220 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.366039 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 21 10:07:01.366418 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.366398 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 21 10:07:01.378777 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.378733 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-4wrmx"]
Apr 21 10:07:01.384356 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.384321 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rvqf\" (UniqueName: \"kubernetes.io/projected/00a9f177-852d-4071-8823-418bcec59544-kube-api-access-5rvqf\") pod \"downloads-6bcc868b7-4wrmx\" (UID: \"00a9f177-852d-4071-8823-418bcec59544\") " pod="openshift-console/downloads-6bcc868b7-4wrmx"
Apr 21 10:07:01.485703 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.485666 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rvqf\" (UniqueName: \"kubernetes.io/projected/00a9f177-852d-4071-8823-418bcec59544-kube-api-access-5rvqf\") pod \"downloads-6bcc868b7-4wrmx\" (UID: \"00a9f177-852d-4071-8823-418bcec59544\") " pod="openshift-console/downloads-6bcc868b7-4wrmx"
Apr 21 10:07:01.495460 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.495424 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rvqf\" (UniqueName: \"kubernetes.io/projected/00a9f177-852d-4071-8823-418bcec59544-kube-api-access-5rvqf\") pod \"downloads-6bcc868b7-4wrmx\" (UID: \"00a9f177-852d-4071-8823-418bcec59544\") " pod="openshift-console/downloads-6bcc868b7-4wrmx"
Apr 21 10:07:01.674890 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.674848 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-4wrmx"
Apr 21 10:07:01.714847 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.714800 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4bsk9"]
Apr 21 10:07:01.719883 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.719329 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4bsk9"
Apr 21 10:07:01.721702 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.721676 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 21 10:07:01.721868 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.721706 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-gtmp9\""
Apr 21 10:07:01.728943 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.728810 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4bsk9"]
Apr 21 10:07:01.788741 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.788675 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8f4fd6b3-4c2d-49bd-a059-b4f9a8b90d38-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4bsk9\" (UID: \"8f4fd6b3-4c2d-49bd-a059-b4f9a8b90d38\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4bsk9"
Apr 21 10:07:01.820784 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.820733 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-4wrmx"]
Apr 21 10:07:01.824309 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:07:01.824273 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00a9f177_852d_4071_8823_418bcec59544.slice/crio-58eb277c82fbfbef5c563bc593d21484ea94444d26e871a7ab36c3a176f12bd3 WatchSource:0}: Error finding container 58eb277c82fbfbef5c563bc593d21484ea94444d26e871a7ab36c3a176f12bd3: Status 404 returned error can't find the container with id 58eb277c82fbfbef5c563bc593d21484ea94444d26e871a7ab36c3a176f12bd3
Apr 21 10:07:01.889892 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:01.889844 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8f4fd6b3-4c2d-49bd-a059-b4f9a8b90d38-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4bsk9\" (UID: \"8f4fd6b3-4c2d-49bd-a059-b4f9a8b90d38\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4bsk9"
Apr 21 10:07:01.890107 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:07:01.889998 2577 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found
Apr 21 10:07:01.890107 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:07:01.890072 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f4fd6b3-4c2d-49bd-a059-b4f9a8b90d38-monitoring-plugin-cert podName:8f4fd6b3-4c2d-49bd-a059-b4f9a8b90d38 nodeName:}" failed. No retries permitted until 2026-04-21 10:07:02.39005271 +0000 UTC m=+190.728586711 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/8f4fd6b3-4c2d-49bd-a059-b4f9a8b90d38-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-4bsk9" (UID: "8f4fd6b3-4c2d-49bd-a059-b4f9a8b90d38") : secret "monitoring-plugin-cert" not found
Apr 21 10:07:02.394172 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:02.394126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8f4fd6b3-4c2d-49bd-a059-b4f9a8b90d38-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4bsk9\" (UID: \"8f4fd6b3-4c2d-49bd-a059-b4f9a8b90d38\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4bsk9"
Apr 21 10:07:02.397320 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:02.397234 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/8f4fd6b3-4c2d-49bd-a059-b4f9a8b90d38-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-4bsk9\" (UID: \"8f4fd6b3-4c2d-49bd-a059-b4f9a8b90d38\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4bsk9"
Apr 21 10:07:02.634907 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:02.634862 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4bsk9"
Apr 21 10:07:02.810288 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:02.810243 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-4wrmx" event={"ID":"00a9f177-852d-4071-8823-418bcec59544","Type":"ContainerStarted","Data":"58eb277c82fbfbef5c563bc593d21484ea94444d26e871a7ab36c3a176f12bd3"}
Apr 21 10:07:02.862495 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:02.862467 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-4bsk9"]
Apr 21 10:07:02.865674 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:07:02.865650 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f4fd6b3_4c2d_49bd_a059_b4f9a8b90d38.slice/crio-e793fc5123f00e9b19da19af903b22c46a815e8124750e909ea3ea81b847d78d WatchSource:0}: Error finding container e793fc5123f00e9b19da19af903b22c46a815e8124750e909ea3ea81b847d78d: Status 404 returned error can't find the container with id e793fc5123f00e9b19da19af903b22c46a815e8124750e909ea3ea81b847d78d
Apr 21 10:07:03.815233 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:03.815197 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4bsk9" event={"ID":"8f4fd6b3-4c2d-49bd-a059-b4f9a8b90d38","Type":"ContainerStarted","Data":"e793fc5123f00e9b19da19af903b22c46a815e8124750e909ea3ea81b847d78d"}
Apr 21 10:07:03.818504 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:03.818465 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb" event={"ID":"afe4eaaa-27cd-4437-9086-34a188a2d172","Type":"ContainerStarted","Data":"e15e247a634d4cbc485fcc1eca12f3749b14e65d9efe00cfdf4036afe34234bb"}
Apr 21 10:07:03.818649 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:03.818507 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb" event={"ID":"afe4eaaa-27cd-4437-9086-34a188a2d172","Type":"ContainerStarted","Data":"70fd1540079bd8de4bf93a76e9250c8f3b07cbe4fcd7abf9648f38d622753cf1"}
Apr 21 10:07:03.818649 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:03.818522 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb" event={"ID":"afe4eaaa-27cd-4437-9086-34a188a2d172","Type":"ContainerStarted","Data":"3ca912829fc75daee7821ccbd649f4b1c7053bbb72df9e7c19b25c5e190be955"}
Apr 21 10:07:04.824819 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:04.824785 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb" event={"ID":"afe4eaaa-27cd-4437-9086-34a188a2d172","Type":"ContainerStarted","Data":"3b23d6ad4962d5fdc1feb9cd8085704e76bebec2e71c4d9dfbb14da2950a54a0"}
Apr 21 10:07:04.825250 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:04.824829 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb" event={"ID":"afe4eaaa-27cd-4437-9086-34a188a2d172","Type":"ContainerStarted","Data":"848d3d93bde01f87eb25824cf92e36a73580d1772d45f76410520a9806bb936c"}
Apr 21 10:07:04.825250 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:04.824838 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb" event={"ID":"afe4eaaa-27cd-4437-9086-34a188a2d172","Type":"ContainerStarted","Data":"0f9560ae3fb9ef1f6576238edea90cc33296143974b262b0e154786a375b2452"}
Apr 21 10:07:04.825250 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:04.824948 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb"
Apr 21 10:07:04.826263 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:04.826232 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4bsk9" event={"ID":"8f4fd6b3-4c2d-49bd-a059-b4f9a8b90d38","Type":"ContainerStarted","Data":"2dfd9068bc597bee951198727d7533ff638e74337f19e7ea42166ac97939c18d"}
Apr 21 10:07:04.826429 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:04.826401 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4bsk9"
Apr 21 10:07:04.831462 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:04.831437 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4bsk9"
Apr 21 10:07:04.871217 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:04.871159 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb" podStartSLOduration=1.689799674 podStartE2EDuration="4.871141072s" podCreationTimestamp="2026-04-21 10:07:00 +0000 UTC" firstStartedPulling="2026-04-21 10:07:00.635843933 +0000 UTC m=+188.974377928" lastFinishedPulling="2026-04-21 10:07:03.817185324 +0000 UTC m=+192.155719326" observedRunningTime="2026-04-21 10:07:04.869577846 +0000 UTC m=+193.208111862" watchObservedRunningTime="2026-04-21 10:07:04.871141072 +0000 UTC m=+193.209675088"
Apr 21 10:07:04.904539 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:04.904479 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-4bsk9" podStartSLOduration=2.46318281 podStartE2EDuration="3.904456314s" podCreationTimestamp="2026-04-21 10:07:01 +0000 UTC" firstStartedPulling="2026-04-21 10:07:02.867608606 +0000 UTC m=+191.206142600" lastFinishedPulling="2026-04-21 10:07:04.308882094 +0000 UTC m=+192.647416104" observedRunningTime="2026-04-21 10:07:04.902307987 +0000 UTC m=+193.240842006" watchObservedRunningTime="2026-04-21 10:07:04.904456314 +0000 UTC m=+193.242990333"
Apr 21 10:07:09.075628 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.075593 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-58f9c847f7-xvhw5"]
Apr 21 10:07:09.079069 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.079047 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.082249 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.082223 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 21 10:07:09.082400 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.082311 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 21 10:07:09.083329 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.083310 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 21 10:07:09.083329 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.083318 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 21 10:07:09.083494 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.083353 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 21 10:07:09.084019 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.084000 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-thkz4\""
Apr 21 10:07:09.092267 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.092237 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58f9c847f7-xvhw5"]
Apr 21 10:07:09.156515 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.156478 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vg7d\" (UniqueName: \"kubernetes.io/projected/9362dc62-82ed-4b54-adf1-74be4565a0ae-kube-api-access-8vg7d\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.156795 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.156532 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-serving-cert\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.156795 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.156646 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-service-ca\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.156795 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.156688 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-config\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.156795 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.156712 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-oauth-serving-cert\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.156795 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.156742 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-oauth-config\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.258010 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.257960 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-service-ca\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.258308 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.258023 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-config\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.258308 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.258141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-oauth-serving-cert\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.258308 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.258179 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-oauth-config\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.258308 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.258244 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vg7d\" (UniqueName: \"kubernetes.io/projected/9362dc62-82ed-4b54-adf1-74be4565a0ae-kube-api-access-8vg7d\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.258308 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.258271 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-serving-cert\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.258882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.258824 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-service-ca\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.258882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.258824 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-oauth-serving-cert\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.259072 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.258902 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-config\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.261004 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.260978 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-serving-cert\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.261107 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.261054 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-oauth-config\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.270542 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.270506 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vg7d\" (UniqueName: \"kubernetes.io/projected/9362dc62-82ed-4b54-adf1-74be4565a0ae-kube-api-access-8vg7d\") pod \"console-58f9c847f7-xvhw5\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.388500 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.388412 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58f9c847f7-xvhw5"
Apr 21 10:07:09.495019 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.494935 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5c9756474c-jljk4" podUID="0d34def8-62fa-4cc6-a42f-4efb93bf5234" containerName="registry" containerID="cri-o://5aac823bcdf65d084c3fea4e4014d5d990a0bc1d03e004682c0c993bb9b6a2e2" gracePeriod=30
Apr 21 10:07:09.530569 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.530540 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58f9c847f7-xvhw5"]
Apr 21 10:07:09.551929 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:07:09.551885 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9362dc62_82ed_4b54_adf1_74be4565a0ae.slice/crio-5863a178f69d2afba3b611ad37a8361602d3ef81d4e0dab7322b2ba91e61cf19 WatchSource:0}: Error finding container 5863a178f69d2afba3b611ad37a8361602d3ef81d4e0dab7322b2ba91e61cf19: Status 404 returned error can't find the container with id 5863a178f69d2afba3b611ad37a8361602d3ef81d4e0dab7322b2ba91e61cf19
Apr 21 10:07:09.752438 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.752415 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5c9756474c-jljk4"
Apr 21 10:07:09.843778 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.843713 2577 generic.go:358] "Generic (PLEG): container finished" podID="0d34def8-62fa-4cc6-a42f-4efb93bf5234" containerID="5aac823bcdf65d084c3fea4e4014d5d990a0bc1d03e004682c0c993bb9b6a2e2" exitCode=0
Apr 21 10:07:09.843970 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.843798 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5c9756474c-jljk4" Apr 21 10:07:09.843970 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.843825 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c9756474c-jljk4" event={"ID":"0d34def8-62fa-4cc6-a42f-4efb93bf5234","Type":"ContainerDied","Data":"5aac823bcdf65d084c3fea4e4014d5d990a0bc1d03e004682c0c993bb9b6a2e2"} Apr 21 10:07:09.843970 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.843856 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5c9756474c-jljk4" event={"ID":"0d34def8-62fa-4cc6-a42f-4efb93bf5234","Type":"ContainerDied","Data":"686f2c96cee31161763d4dc213d1c92b51c6daa4b8a60422e5fe33d93b55a16d"} Apr 21 10:07:09.843970 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.843877 2577 scope.go:117] "RemoveContainer" containerID="5aac823bcdf65d084c3fea4e4014d5d990a0bc1d03e004682c0c993bb9b6a2e2" Apr 21 10:07:09.845078 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.845035 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58f9c847f7-xvhw5" event={"ID":"9362dc62-82ed-4b54-adf1-74be4565a0ae","Type":"ContainerStarted","Data":"5863a178f69d2afba3b611ad37a8361602d3ef81d4e0dab7322b2ba91e61cf19"} Apr 21 10:07:09.852484 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.852458 2577 scope.go:117] "RemoveContainer" containerID="5aac823bcdf65d084c3fea4e4014d5d990a0bc1d03e004682c0c993bb9b6a2e2" Apr 21 10:07:09.852848 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:07:09.852819 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5aac823bcdf65d084c3fea4e4014d5d990a0bc1d03e004682c0c993bb9b6a2e2\": container with ID starting with 5aac823bcdf65d084c3fea4e4014d5d990a0bc1d03e004682c0c993bb9b6a2e2 not found: ID does not exist" 
containerID="5aac823bcdf65d084c3fea4e4014d5d990a0bc1d03e004682c0c993bb9b6a2e2" Apr 21 10:07:09.852983 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.852858 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5aac823bcdf65d084c3fea4e4014d5d990a0bc1d03e004682c0c993bb9b6a2e2"} err="failed to get container status \"5aac823bcdf65d084c3fea4e4014d5d990a0bc1d03e004682c0c993bb9b6a2e2\": rpc error: code = NotFound desc = could not find container \"5aac823bcdf65d084c3fea4e4014d5d990a0bc1d03e004682c0c993bb9b6a2e2\": container with ID starting with 5aac823bcdf65d084c3fea4e4014d5d990a0bc1d03e004682c0c993bb9b6a2e2 not found: ID does not exist" Apr 21 10:07:09.863759 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.863717 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-certificates\") pod \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " Apr 21 10:07:09.863932 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.863777 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0d34def8-62fa-4cc6-a42f-4efb93bf5234-installation-pull-secrets\") pod \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " Apr 21 10:07:09.863932 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.863804 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn7xv\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-kube-api-access-kn7xv\") pod \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " Apr 21 10:07:09.863932 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.863834 2577 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0d34def8-62fa-4cc6-a42f-4efb93bf5234-ca-trust-extracted\") pod \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " Apr 21 10:07:09.863932 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.863880 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d34def8-62fa-4cc6-a42f-4efb93bf5234-trusted-ca\") pod \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " Apr 21 10:07:09.863932 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.863911 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls\") pod \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " Apr 21 10:07:09.864179 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.863963 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0d34def8-62fa-4cc6-a42f-4efb93bf5234-image-registry-private-configuration\") pod \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " Apr 21 10:07:09.864179 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.863997 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-bound-sa-token\") pod \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\" (UID: \"0d34def8-62fa-4cc6-a42f-4efb93bf5234\") " Apr 21 10:07:09.864288 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.864224 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0d34def8-62fa-4cc6-a42f-4efb93bf5234" (UID: "0d34def8-62fa-4cc6-a42f-4efb93bf5234"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:07:09.864627 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.864567 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d34def8-62fa-4cc6-a42f-4efb93bf5234-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0d34def8-62fa-4cc6-a42f-4efb93bf5234" (UID: "0d34def8-62fa-4cc6-a42f-4efb93bf5234"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:07:09.866720 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.866633 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d34def8-62fa-4cc6-a42f-4efb93bf5234-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0d34def8-62fa-4cc6-a42f-4efb93bf5234" (UID: "0d34def8-62fa-4cc6-a42f-4efb93bf5234"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:07:09.866720 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.866671 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0d34def8-62fa-4cc6-a42f-4efb93bf5234" (UID: "0d34def8-62fa-4cc6-a42f-4efb93bf5234"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:07:09.866720 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.866700 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-kube-api-access-kn7xv" (OuterVolumeSpecName: "kube-api-access-kn7xv") pod "0d34def8-62fa-4cc6-a42f-4efb93bf5234" (UID: "0d34def8-62fa-4cc6-a42f-4efb93bf5234"). InnerVolumeSpecName "kube-api-access-kn7xv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:07:09.866968 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.866877 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0d34def8-62fa-4cc6-a42f-4efb93bf5234" (UID: "0d34def8-62fa-4cc6-a42f-4efb93bf5234"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:07:09.867080 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.867059 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d34def8-62fa-4cc6-a42f-4efb93bf5234-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0d34def8-62fa-4cc6-a42f-4efb93bf5234" (UID: "0d34def8-62fa-4cc6-a42f-4efb93bf5234"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:07:09.873372 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.873338 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d34def8-62fa-4cc6-a42f-4efb93bf5234-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0d34def8-62fa-4cc6-a42f-4efb93bf5234" (UID: "0d34def8-62fa-4cc6-a42f-4efb93bf5234"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:07:09.965453 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.965413 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kn7xv\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-kube-api-access-kn7xv\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:07:09.965453 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.965448 2577 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0d34def8-62fa-4cc6-a42f-4efb93bf5234-ca-trust-extracted\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:07:09.965662 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.965461 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d34def8-62fa-4cc6-a42f-4efb93bf5234-trusted-ca\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:07:09.965662 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.965473 2577 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-tls\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:07:09.965662 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.965486 2577 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0d34def8-62fa-4cc6-a42f-4efb93bf5234-image-registry-private-configuration\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:07:09.965662 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.965501 2577 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d34def8-62fa-4cc6-a42f-4efb93bf5234-bound-sa-token\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 
10:07:09.965662 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.965513 2577 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0d34def8-62fa-4cc6-a42f-4efb93bf5234-registry-certificates\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:07:09.965662 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:09.965524 2577 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0d34def8-62fa-4cc6-a42f-4efb93bf5234-installation-pull-secrets\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:07:10.167642 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:10.167611 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5c9756474c-jljk4"] Apr 21 10:07:10.170304 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:10.170278 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5c9756474c-jljk4"] Apr 21 10:07:10.233147 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:10.233061 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d34def8-62fa-4cc6-a42f-4efb93bf5234" path="/var/lib/kubelet/pods/0d34def8-62fa-4cc6-a42f-4efb93bf5234/volumes" Apr 21 10:07:10.838937 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:10.838896 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5d57f5d8f8-lvwjb" Apr 21 10:07:12.857355 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:12.857261 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58f9c847f7-xvhw5" event={"ID":"9362dc62-82ed-4b54-adf1-74be4565a0ae","Type":"ContainerStarted","Data":"5596b68b42ecfd2ac89c53a06cf124d56d42e63249200ea1b52db0e06beafff1"} Apr 21 10:07:12.876219 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:12.876153 2577 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-console/console-58f9c847f7-xvhw5" podStartSLOduration=0.952666517 podStartE2EDuration="3.876133324s" podCreationTimestamp="2026-04-21 10:07:09 +0000 UTC" firstStartedPulling="2026-04-21 10:07:09.600653508 +0000 UTC m=+197.939187503" lastFinishedPulling="2026-04-21 10:07:12.524120316 +0000 UTC m=+200.862654310" observedRunningTime="2026-04-21 10:07:12.875063909 +0000 UTC m=+201.213597937" watchObservedRunningTime="2026-04-21 10:07:12.876133324 +0000 UTC m=+201.214667345" Apr 21 10:07:16.657563 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.657521 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-75d6986bd6-bw6jc"] Apr 21 10:07:16.658017 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.657894 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d34def8-62fa-4cc6-a42f-4efb93bf5234" containerName="registry" Apr 21 10:07:16.658017 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.657909 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d34def8-62fa-4cc6-a42f-4efb93bf5234" containerName="registry" Apr 21 10:07:16.658017 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.657985 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d34def8-62fa-4cc6-a42f-4efb93bf5234" containerName="registry" Apr 21 10:07:16.662151 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.662124 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.669483 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.669450 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 21 10:07:16.672425 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.672392 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75d6986bd6-bw6jc"] Apr 21 10:07:16.729790 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.729730 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-service-ca\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.729790 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.729796 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-config\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.730034 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.729934 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-599ml\" (UniqueName: \"kubernetes.io/projected/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-kube-api-access-599ml\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.730034 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.729983 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-oauth-serving-cert\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.730034 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.730030 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-serving-cert\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.730181 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.730057 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-trusted-ca-bundle\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.730181 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.730088 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-oauth-config\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.830639 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.830596 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-service-ca\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.830639 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:07:16.830642 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-config\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.830919 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.830695 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-599ml\" (UniqueName: \"kubernetes.io/projected/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-kube-api-access-599ml\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.830919 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.830718 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-oauth-serving-cert\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.830919 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.830764 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-serving-cert\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.830919 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.830821 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-trusted-ca-bundle\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " 
pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.830919 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.830871 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-oauth-config\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.831539 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.831444 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-service-ca\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.831677 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.831549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-config\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.832015 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.831986 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-trusted-ca-bundle\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.832182 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.832158 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-oauth-serving-cert\") pod \"console-75d6986bd6-bw6jc\" (UID: 
\"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.833538 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.833507 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-oauth-config\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.833741 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.833722 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-serving-cert\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.840074 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.840034 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-599ml\" (UniqueName: \"kubernetes.io/projected/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-kube-api-access-599ml\") pod \"console-75d6986bd6-bw6jc\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") " pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:16.974596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:16.974559 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:19.389013 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:19.388975 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-58f9c847f7-xvhw5" Apr 21 10:07:19.389013 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:19.389018 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-58f9c847f7-xvhw5" Apr 21 10:07:19.394286 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:19.394256 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-58f9c847f7-xvhw5" Apr 21 10:07:19.885181 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:19.885145 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-58f9c847f7-xvhw5" Apr 21 10:07:20.873679 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:20.873643 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75d6986bd6-bw6jc"] Apr 21 10:07:20.876726 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:07:20.876690 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod973a2c88_57d4_4b8e_bb09_06e91a8d11cd.slice/crio-13495dfe44b9c7533a585d859494f8109112fe8fd90478d4deb4b0c1f7d048c4 WatchSource:0}: Error finding container 13495dfe44b9c7533a585d859494f8109112fe8fd90478d4deb4b0c1f7d048c4: Status 404 returned error can't find the container with id 13495dfe44b9c7533a585d859494f8109112fe8fd90478d4deb4b0c1f7d048c4 Apr 21 10:07:20.884464 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:20.884433 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75d6986bd6-bw6jc" event={"ID":"973a2c88-57d4-4b8e-bb09-06e91a8d11cd","Type":"ContainerStarted","Data":"13495dfe44b9c7533a585d859494f8109112fe8fd90478d4deb4b0c1f7d048c4"} Apr 21 
10:07:21.888967 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:21.888932 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-4wrmx" event={"ID":"00a9f177-852d-4071-8823-418bcec59544","Type":"ContainerStarted","Data":"f134b0177ed53ad10e9f350ef548d68879d58ef98bb8b52748aaa9223949cfe1"} Apr 21 10:07:21.889444 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:21.889130 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-4wrmx" Apr 21 10:07:21.890670 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:21.890633 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75d6986bd6-bw6jc" event={"ID":"973a2c88-57d4-4b8e-bb09-06e91a8d11cd","Type":"ContainerStarted","Data":"d76b7afa0e710caf12621462bf1c193e346eca22edd126f572e24446ed92b62b"} Apr 21 10:07:21.905762 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:21.905718 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-4wrmx" Apr 21 10:07:21.915916 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:21.915863 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-4wrmx" podStartSLOduration=1.92581366 podStartE2EDuration="20.915847068s" podCreationTimestamp="2026-04-21 10:07:01 +0000 UTC" firstStartedPulling="2026-04-21 10:07:01.826966307 +0000 UTC m=+190.165500321" lastFinishedPulling="2026-04-21 10:07:20.816999731 +0000 UTC m=+209.155533729" observedRunningTime="2026-04-21 10:07:21.915431918 +0000 UTC m=+210.253965947" watchObservedRunningTime="2026-04-21 10:07:21.915847068 +0000 UTC m=+210.254381086" Apr 21 10:07:21.965432 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:21.965368 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75d6986bd6-bw6jc" podStartSLOduration=5.965346892 
podStartE2EDuration="5.965346892s" podCreationTimestamp="2026-04-21 10:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:07:21.964062562 +0000 UTC m=+210.302596605" watchObservedRunningTime="2026-04-21 10:07:21.965346892 +0000 UTC m=+210.303880908" Apr 21 10:07:26.974998 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:26.974952 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:26.975548 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:26.975016 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:26.980577 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:26.980544 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:27.913456 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:27.913418 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75d6986bd6-bw6jc" Apr 21 10:07:27.964940 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:27.964903 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58f9c847f7-xvhw5"] Apr 21 10:07:33.925794 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:33.925682 2577 generic.go:358] "Generic (PLEG): container finished" podID="42a73500-9d9d-4413-8386-90da11688aaa" containerID="a5fd0ec4fa5adfff2013c81ea12945d6a1e7d7a9a3e10da6aba1f6fa6425cb8b" exitCode=0 Apr 21 10:07:33.925794 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:33.925774 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9" 
event={"ID":"42a73500-9d9d-4413-8386-90da11688aaa","Type":"ContainerDied","Data":"a5fd0ec4fa5adfff2013c81ea12945d6a1e7d7a9a3e10da6aba1f6fa6425cb8b"} Apr 21 10:07:33.926305 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:33.926147 2577 scope.go:117] "RemoveContainer" containerID="a5fd0ec4fa5adfff2013c81ea12945d6a1e7d7a9a3e10da6aba1f6fa6425cb8b" Apr 21 10:07:34.612217 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:34.612145 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-75v2d_5a4cb6b4-0870-47bc-b13c-24f96bc4d282/dns/0.log" Apr 21 10:07:34.617668 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:34.617636 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-75v2d_5a4cb6b4-0870-47bc-b13c-24f96bc4d282/kube-rbac-proxy/0.log" Apr 21 10:07:34.930881 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:34.930788 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-pk2f9" event={"ID":"42a73500-9d9d-4413-8386-90da11688aaa","Type":"ContainerStarted","Data":"06a4006dd4fece81b38543fff409b04ec737b2050c5ab6742f30008e87868fb5"} Apr 21 10:07:35.197803 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:35.197695 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-c588q_186c1594-0ba9-495b-8213-27692e681b57/dns-node-resolver/0.log" Apr 21 10:07:52.989135 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:52.989073 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-58f9c847f7-xvhw5" podUID="9362dc62-82ed-4b54-adf1-74be4565a0ae" containerName="console" containerID="cri-o://5596b68b42ecfd2ac89c53a06cf124d56d42e63249200ea1b52db0e06beafff1" gracePeriod=15 Apr 21 10:07:53.286611 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.286584 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-58f9c847f7-xvhw5_9362dc62-82ed-4b54-adf1-74be4565a0ae/console/0.log" Apr 21 10:07:53.286780 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.286656 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58f9c847f7-xvhw5" Apr 21 10:07:53.361458 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.361424 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-service-ca\") pod \"9362dc62-82ed-4b54-adf1-74be4565a0ae\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " Apr 21 10:07:53.361645 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.361480 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-oauth-config\") pod \"9362dc62-82ed-4b54-adf1-74be4565a0ae\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " Apr 21 10:07:53.361645 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.361509 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vg7d\" (UniqueName: \"kubernetes.io/projected/9362dc62-82ed-4b54-adf1-74be4565a0ae-kube-api-access-8vg7d\") pod \"9362dc62-82ed-4b54-adf1-74be4565a0ae\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " Apr 21 10:07:53.361645 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.361527 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-oauth-serving-cert\") pod \"9362dc62-82ed-4b54-adf1-74be4565a0ae\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " Apr 21 10:07:53.361845 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.361644 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-serving-cert\") pod \"9362dc62-82ed-4b54-adf1-74be4565a0ae\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " Apr 21 10:07:53.361845 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.361693 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-config\") pod \"9362dc62-82ed-4b54-adf1-74be4565a0ae\" (UID: \"9362dc62-82ed-4b54-adf1-74be4565a0ae\") " Apr 21 10:07:53.361959 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.361893 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9362dc62-82ed-4b54-adf1-74be4565a0ae" (UID: "9362dc62-82ed-4b54-adf1-74be4565a0ae"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:07:53.361959 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.361933 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-service-ca" (OuterVolumeSpecName: "service-ca") pod "9362dc62-82ed-4b54-adf1-74be4565a0ae" (UID: "9362dc62-82ed-4b54-adf1-74be4565a0ae"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:07:53.362064 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.362009 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-oauth-serving-cert\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:07:53.362064 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.362026 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-service-ca\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:07:53.362219 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.362192 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-config" (OuterVolumeSpecName: "console-config") pod "9362dc62-82ed-4b54-adf1-74be4565a0ae" (UID: "9362dc62-82ed-4b54-adf1-74be4565a0ae"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:07:53.363915 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.363882 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9362dc62-82ed-4b54-adf1-74be4565a0ae" (UID: "9362dc62-82ed-4b54-adf1-74be4565a0ae"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:07:53.364033 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.363923 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9362dc62-82ed-4b54-adf1-74be4565a0ae" (UID: "9362dc62-82ed-4b54-adf1-74be4565a0ae"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:07:53.364187 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.364171 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9362dc62-82ed-4b54-adf1-74be4565a0ae-kube-api-access-8vg7d" (OuterVolumeSpecName: "kube-api-access-8vg7d") pod "9362dc62-82ed-4b54-adf1-74be4565a0ae" (UID: "9362dc62-82ed-4b54-adf1-74be4565a0ae"). InnerVolumeSpecName "kube-api-access-8vg7d". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:07:53.462532 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.462489 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-oauth-config\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:07:53.462532 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.462522 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8vg7d\" (UniqueName: \"kubernetes.io/projected/9362dc62-82ed-4b54-adf1-74be4565a0ae-kube-api-access-8vg7d\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:07:53.462532 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.462541 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-serving-cert\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:07:53.462853 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.462554 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9362dc62-82ed-4b54-adf1-74be4565a0ae-console-config\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:07:53.985961 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.985929 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-58f9c847f7-xvhw5_9362dc62-82ed-4b54-adf1-74be4565a0ae/console/0.log" Apr 21 10:07:53.986165 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.985973 2577 generic.go:358] "Generic (PLEG): container finished" podID="9362dc62-82ed-4b54-adf1-74be4565a0ae" containerID="5596b68b42ecfd2ac89c53a06cf124d56d42e63249200ea1b52db0e06beafff1" exitCode=2 Apr 21 10:07:53.986165 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.986015 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58f9c847f7-xvhw5" event={"ID":"9362dc62-82ed-4b54-adf1-74be4565a0ae","Type":"ContainerDied","Data":"5596b68b42ecfd2ac89c53a06cf124d56d42e63249200ea1b52db0e06beafff1"} Apr 21 10:07:53.986165 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.986062 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58f9c847f7-xvhw5" event={"ID":"9362dc62-82ed-4b54-adf1-74be4565a0ae","Type":"ContainerDied","Data":"5863a178f69d2afba3b611ad37a8361602d3ef81d4e0dab7322b2ba91e61cf19"} Apr 21 10:07:53.986165 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.986079 2577 scope.go:117] "RemoveContainer" containerID="5596b68b42ecfd2ac89c53a06cf124d56d42e63249200ea1b52db0e06beafff1" Apr 21 10:07:53.986165 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.986088 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58f9c847f7-xvhw5" Apr 21 10:07:53.995379 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.995357 2577 scope.go:117] "RemoveContainer" containerID="5596b68b42ecfd2ac89c53a06cf124d56d42e63249200ea1b52db0e06beafff1" Apr 21 10:07:53.995711 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:07:53.995681 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5596b68b42ecfd2ac89c53a06cf124d56d42e63249200ea1b52db0e06beafff1\": container with ID starting with 5596b68b42ecfd2ac89c53a06cf124d56d42e63249200ea1b52db0e06beafff1 not found: ID does not exist" containerID="5596b68b42ecfd2ac89c53a06cf124d56d42e63249200ea1b52db0e06beafff1" Apr 21 10:07:53.995781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:53.995708 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5596b68b42ecfd2ac89c53a06cf124d56d42e63249200ea1b52db0e06beafff1"} err="failed to get container status \"5596b68b42ecfd2ac89c53a06cf124d56d42e63249200ea1b52db0e06beafff1\": rpc error: code = NotFound desc = could not find container \"5596b68b42ecfd2ac89c53a06cf124d56d42e63249200ea1b52db0e06beafff1\": container with ID starting with 5596b68b42ecfd2ac89c53a06cf124d56d42e63249200ea1b52db0e06beafff1 not found: ID does not exist" Apr 21 10:07:54.007797 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:54.007735 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58f9c847f7-xvhw5"] Apr 21 10:07:54.011810 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:54.011774 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-58f9c847f7-xvhw5"] Apr 21 10:07:54.231085 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:07:54.231052 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9362dc62-82ed-4b54-adf1-74be4565a0ae" 
path="/var/lib/kubelet/pods/9362dc62-82ed-4b54-adf1-74be4565a0ae/volumes" Apr 21 10:08:04.052641 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:04.052590 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs\") pod \"network-metrics-daemon-kckvj\" (UID: \"e437b5da-7e75-4ed5-8d79-e418168b80fe\") " pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:08:04.055066 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:04.055035 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e437b5da-7e75-4ed5-8d79-e418168b80fe-metrics-certs\") pod \"network-metrics-daemon-kckvj\" (UID: \"e437b5da-7e75-4ed5-8d79-e418168b80fe\") " pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:08:04.233944 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:04.233902 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfhd5\"" Apr 21 10:08:04.241741 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:04.241713 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kckvj" Apr 21 10:08:04.368369 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:04.368331 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kckvj"] Apr 21 10:08:04.371847 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:08:04.371799 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode437b5da_7e75_4ed5_8d79_e418168b80fe.slice/crio-e4cd6bfd7c122e5c12faea11e400cf466675711e3208652e6b7af838518c0958 WatchSource:0}: Error finding container e4cd6bfd7c122e5c12faea11e400cf466675711e3208652e6b7af838518c0958: Status 404 returned error can't find the container with id e4cd6bfd7c122e5c12faea11e400cf466675711e3208652e6b7af838518c0958 Apr 21 10:08:05.023317 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:05.023275 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kckvj" event={"ID":"e437b5da-7e75-4ed5-8d79-e418168b80fe","Type":"ContainerStarted","Data":"e4cd6bfd7c122e5c12faea11e400cf466675711e3208652e6b7af838518c0958"} Apr 21 10:08:06.027604 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:06.027568 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kckvj" event={"ID":"e437b5da-7e75-4ed5-8d79-e418168b80fe","Type":"ContainerStarted","Data":"f3865a4fbc48cd558e59f75d0b0880833acde36f7f7f9064f05f0da7ea4caa70"} Apr 21 10:08:07.032895 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:07.032857 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kckvj" event={"ID":"e437b5da-7e75-4ed5-8d79-e418168b80fe","Type":"ContainerStarted","Data":"ea93f7f3aead81e3c0ac9bd2a6988daa6869131a786e3c322719883836867fb9"} Apr 21 10:08:07.049170 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:07.049117 2577 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-kckvj" podStartSLOduration=253.623540379 podStartE2EDuration="4m15.049102484s" podCreationTimestamp="2026-04-21 10:03:52 +0000 UTC" firstStartedPulling="2026-04-21 10:08:04.373718651 +0000 UTC m=+252.712252646" lastFinishedPulling="2026-04-21 10:08:05.799280753 +0000 UTC m=+254.137814751" observedRunningTime="2026-04-21 10:08:07.047627068 +0000 UTC m=+255.386161085" watchObservedRunningTime="2026-04-21 10:08:07.049102484 +0000 UTC m=+255.387636500" Apr 21 10:08:24.531714 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.531620 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f6fff848f-58wsb"] Apr 21 10:08:24.532151 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.531989 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9362dc62-82ed-4b54-adf1-74be4565a0ae" containerName="console" Apr 21 10:08:24.532151 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.532006 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9362dc62-82ed-4b54-adf1-74be4565a0ae" containerName="console" Apr 21 10:08:24.532151 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.532081 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9362dc62-82ed-4b54-adf1-74be4565a0ae" containerName="console" Apr 21 10:08:24.534927 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.534906 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.545848 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.545811 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f6fff848f-58wsb"] Apr 21 10:08:24.614834 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.614796 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-serving-cert\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.614834 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.614838 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-trusted-ca-bundle\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.615046 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.614939 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-oauth-config\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.615046 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.614992 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-config\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 
10:08:24.615046 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.615040 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-oauth-serving-cert\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.615046 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.615070 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-service-ca\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.615243 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.615115 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf7q6\" (UniqueName: \"kubernetes.io/projected/146bb9bb-e778-4eee-b20b-750bf8b9c59d-kube-api-access-lf7q6\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.715785 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.715708 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-config\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.715971 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.715802 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-oauth-serving-cert\") pod 
\"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.715971 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.715833 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-service-ca\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.715971 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.715851 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lf7q6\" (UniqueName: \"kubernetes.io/projected/146bb9bb-e778-4eee-b20b-750bf8b9c59d-kube-api-access-lf7q6\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.715971 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.715871 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-serving-cert\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.715971 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.715894 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-trusted-ca-bundle\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.715971 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.715934 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-oauth-config\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.716602 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.716543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-config\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.716602 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.716543 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-oauth-serving-cert\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.716763 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.716636 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-service-ca\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.716954 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.716924 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-trusted-ca-bundle\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.718499 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.718468 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-serving-cert\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.718601 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.718546 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-oauth-config\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.729446 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.729415 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf7q6\" (UniqueName: \"kubernetes.io/projected/146bb9bb-e778-4eee-b20b-750bf8b9c59d-kube-api-access-lf7q6\") pod \"console-5f6fff848f-58wsb\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") " pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.847182 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.847065 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:08:24.973440 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:24.973396 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f6fff848f-58wsb"] Apr 21 10:08:24.976975 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:08:24.976944 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod146bb9bb_e778_4eee_b20b_750bf8b9c59d.slice/crio-2a3a46bc3e4245f2bc53227c7f9f3302186347ad04dd3e415bbb180cb4b83ad3 WatchSource:0}: Error finding container 2a3a46bc3e4245f2bc53227c7f9f3302186347ad04dd3e415bbb180cb4b83ad3: Status 404 returned error can't find the container with id 2a3a46bc3e4245f2bc53227c7f9f3302186347ad04dd3e415bbb180cb4b83ad3 Apr 21 10:08:25.084766 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:25.084720 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6fff848f-58wsb" event={"ID":"146bb9bb-e778-4eee-b20b-750bf8b9c59d","Type":"ContainerStarted","Data":"61328c9a4ef3149b98118a6a2ca1b64553cdc3ded2570e11ef88c63f7b4adfb1"} Apr 21 10:08:25.084954 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:25.084785 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6fff848f-58wsb" event={"ID":"146bb9bb-e778-4eee-b20b-750bf8b9c59d","Type":"ContainerStarted","Data":"2a3a46bc3e4245f2bc53227c7f9f3302186347ad04dd3e415bbb180cb4b83ad3"} Apr 21 10:08:25.102084 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:25.101971 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f6fff848f-58wsb" podStartSLOduration=1.101951956 podStartE2EDuration="1.101951956s" podCreationTimestamp="2026-04-21 10:08:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:08:25.100625445 +0000 UTC 
m=+273.439159461" watchObservedRunningTime="2026-04-21 10:08:25.101951956 +0000 UTC m=+273.440485974"
Apr 21 10:08:34.848157 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:34.848114 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f6fff848f-58wsb"
Apr 21 10:08:34.848593 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:34.848186 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f6fff848f-58wsb"
Apr 21 10:08:34.853035 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:34.853010 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f6fff848f-58wsb"
Apr 21 10:08:35.115774 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:35.115678 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f6fff848f-58wsb"
Apr 21 10:08:35.160536 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:35.160500 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75d6986bd6-bw6jc"]
Apr 21 10:08:52.110845 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:52.110816 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log"
Apr 21 10:08:52.114614 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:52.114587 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log"
Apr 21 10:08:52.119456 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:08:52.119424 2577 kubelet.go:1628] "Image garbage collection succeeded"
Apr 21 10:09:00.180383 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.180323 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-75d6986bd6-bw6jc" podUID="973a2c88-57d4-4b8e-bb09-06e91a8d11cd" containerName="console" containerID="cri-o://d76b7afa0e710caf12621462bf1c193e346eca22edd126f572e24446ed92b62b" gracePeriod=15
Apr 21 10:09:00.412567 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.412544 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75d6986bd6-bw6jc_973a2c88-57d4-4b8e-bb09-06e91a8d11cd/console/0.log"
Apr 21 10:09:00.412703 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.412605 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75d6986bd6-bw6jc"
Apr 21 10:09:00.485275 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.485238 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-service-ca\") pod \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") "
Apr 21 10:09:00.485460 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.485295 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-599ml\" (UniqueName: \"kubernetes.io/projected/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-kube-api-access-599ml\") pod \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") "
Apr 21 10:09:00.485460 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.485332 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-trusted-ca-bundle\") pod \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") "
Apr 21 10:09:00.485460 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.485356 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-oauth-serving-cert\") pod \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") "
Apr 21 10:09:00.485460 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.485390 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-config\") pod \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") "
Apr 21 10:09:00.485460 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.485425 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-serving-cert\") pod \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") "
Apr 21 10:09:00.485689 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.485493 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-oauth-config\") pod \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\" (UID: \"973a2c88-57d4-4b8e-bb09-06e91a8d11cd\") "
Apr 21 10:09:00.485689 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.485594 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-service-ca" (OuterVolumeSpecName: "service-ca") pod "973a2c88-57d4-4b8e-bb09-06e91a8d11cd" (UID: "973a2c88-57d4-4b8e-bb09-06e91a8d11cd"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:09:00.485825 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.485735 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-service-ca\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:09:00.485933 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.485902 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "973a2c88-57d4-4b8e-bb09-06e91a8d11cd" (UID: "973a2c88-57d4-4b8e-bb09-06e91a8d11cd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:09:00.486017 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.485923 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-config" (OuterVolumeSpecName: "console-config") pod "973a2c88-57d4-4b8e-bb09-06e91a8d11cd" (UID: "973a2c88-57d4-4b8e-bb09-06e91a8d11cd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:09:00.486017 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.485912 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "973a2c88-57d4-4b8e-bb09-06e91a8d11cd" (UID: "973a2c88-57d4-4b8e-bb09-06e91a8d11cd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:09:00.487612 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.487581 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "973a2c88-57d4-4b8e-bb09-06e91a8d11cd" (UID: "973a2c88-57d4-4b8e-bb09-06e91a8d11cd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:09:00.487716 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.487620 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "973a2c88-57d4-4b8e-bb09-06e91a8d11cd" (UID: "973a2c88-57d4-4b8e-bb09-06e91a8d11cd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 21 10:09:00.487716 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.487692 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-kube-api-access-599ml" (OuterVolumeSpecName: "kube-api-access-599ml") pod "973a2c88-57d4-4b8e-bb09-06e91a8d11cd" (UID: "973a2c88-57d4-4b8e-bb09-06e91a8d11cd"). InnerVolumeSpecName "kube-api-access-599ml". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:09:00.587055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.587002 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-599ml\" (UniqueName: \"kubernetes.io/projected/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-kube-api-access-599ml\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:09:00.587055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.587051 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-trusted-ca-bundle\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:09:00.587055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.587061 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-oauth-serving-cert\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:09:00.587055 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.587070 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-config\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:09:00.587313 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.587080 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-serving-cert\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:09:00.587313 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:00.587089 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/973a2c88-57d4-4b8e-bb09-06e91a8d11cd-console-oauth-config\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:09:01.187379 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:01.187351 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75d6986bd6-bw6jc_973a2c88-57d4-4b8e-bb09-06e91a8d11cd/console/0.log"
Apr 21 10:09:01.187816 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:01.187395 2577 generic.go:358] "Generic (PLEG): container finished" podID="973a2c88-57d4-4b8e-bb09-06e91a8d11cd" containerID="d76b7afa0e710caf12621462bf1c193e346eca22edd126f572e24446ed92b62b" exitCode=2
Apr 21 10:09:01.187816 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:01.187487 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75d6986bd6-bw6jc"
Apr 21 10:09:01.187816 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:01.187486 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75d6986bd6-bw6jc" event={"ID":"973a2c88-57d4-4b8e-bb09-06e91a8d11cd","Type":"ContainerDied","Data":"d76b7afa0e710caf12621462bf1c193e346eca22edd126f572e24446ed92b62b"}
Apr 21 10:09:01.187816 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:01.187534 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75d6986bd6-bw6jc" event={"ID":"973a2c88-57d4-4b8e-bb09-06e91a8d11cd","Type":"ContainerDied","Data":"13495dfe44b9c7533a585d859494f8109112fe8fd90478d4deb4b0c1f7d048c4"}
Apr 21 10:09:01.187816 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:01.187553 2577 scope.go:117] "RemoveContainer" containerID="d76b7afa0e710caf12621462bf1c193e346eca22edd126f572e24446ed92b62b"
Apr 21 10:09:01.196389 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:01.196368 2577 scope.go:117] "RemoveContainer" containerID="d76b7afa0e710caf12621462bf1c193e346eca22edd126f572e24446ed92b62b"
Apr 21 10:09:01.196642 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:09:01.196622 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d76b7afa0e710caf12621462bf1c193e346eca22edd126f572e24446ed92b62b\": container with ID starting with d76b7afa0e710caf12621462bf1c193e346eca22edd126f572e24446ed92b62b not found: ID does not exist" containerID="d76b7afa0e710caf12621462bf1c193e346eca22edd126f572e24446ed92b62b"
Apr 21 10:09:01.196686 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:01.196651 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76b7afa0e710caf12621462bf1c193e346eca22edd126f572e24446ed92b62b"} err="failed to get container status \"d76b7afa0e710caf12621462bf1c193e346eca22edd126f572e24446ed92b62b\": rpc error: code = NotFound desc = could not find container \"d76b7afa0e710caf12621462bf1c193e346eca22edd126f572e24446ed92b62b\": container with ID starting with d76b7afa0e710caf12621462bf1c193e346eca22edd126f572e24446ed92b62b not found: ID does not exist"
Apr 21 10:09:01.208205 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:01.208179 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75d6986bd6-bw6jc"]
Apr 21 10:09:01.210684 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:01.210656 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-75d6986bd6-bw6jc"]
Apr 21 10:09:02.231429 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:02.231397 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="973a2c88-57d4-4b8e-bb09-06e91a8d11cd" path="/var/lib/kubelet/pods/973a2c88-57d4-4b8e-bb09-06e91a8d11cd/volumes"
Apr 21 10:09:39.592062 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.592018 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-697c46fd45-8tb9j"]
Apr 21 10:09:39.592582 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.592368 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="973a2c88-57d4-4b8e-bb09-06e91a8d11cd" containerName="console"
Apr 21 10:09:39.592582 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.592391 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="973a2c88-57d4-4b8e-bb09-06e91a8d11cd" containerName="console"
Apr 21 10:09:39.592582 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.592468 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="973a2c88-57d4-4b8e-bb09-06e91a8d11cd" containerName="console"
Apr 21 10:09:39.595486 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.595462 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.602294 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.602267 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-697c46fd45-8tb9j"]
Apr 21 10:09:39.777468 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.777421 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/914fac9c-114b-46b3-9d94-32255647b5bf-console-serving-cert\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.777468 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.777469 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-service-ca\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.777681 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.777502 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/914fac9c-114b-46b3-9d94-32255647b5bf-console-oauth-config\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.777681 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.777522 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-trusted-ca-bundle\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.777681 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.777550 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-console-config\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.777681 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.777573 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-oauth-serving-cert\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.777681 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.777609 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp7xs\" (UniqueName: \"kubernetes.io/projected/914fac9c-114b-46b3-9d94-32255647b5bf-kube-api-access-pp7xs\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.878573 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.878467 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/914fac9c-114b-46b3-9d94-32255647b5bf-console-serving-cert\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.878573 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.878520 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-service-ca\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.878803 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.878635 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/914fac9c-114b-46b3-9d94-32255647b5bf-console-oauth-config\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.878803 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.878673 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-trusted-ca-bundle\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.878803 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.878720 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-console-config\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.878803 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.878770 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-oauth-serving-cert\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.878803 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.878799 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pp7xs\" (UniqueName: \"kubernetes.io/projected/914fac9c-114b-46b3-9d94-32255647b5bf-kube-api-access-pp7xs\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.879459 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.879429 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-service-ca\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.879548 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.879438 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-oauth-serving-cert\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.879588 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.879549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-trusted-ca-bundle\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.879623 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.879603 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-console-config\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.881011 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.880986 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/914fac9c-114b-46b3-9d94-32255647b5bf-console-serving-cert\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.881092 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.881038 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/914fac9c-114b-46b3-9d94-32255647b5bf-console-oauth-config\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.887058 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.887028 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp7xs\" (UniqueName: \"kubernetes.io/projected/914fac9c-114b-46b3-9d94-32255647b5bf-kube-api-access-pp7xs\") pod \"console-697c46fd45-8tb9j\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:39.907012 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:39.906978 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:40.033959 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:40.033915 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-697c46fd45-8tb9j"]
Apr 21 10:09:40.037210 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:09:40.037171 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod914fac9c_114b_46b3_9d94_32255647b5bf.slice/crio-e75bd9ba680c573d3ee8aef12e40a239173030e6622c50befd8486e8764b7529 WatchSource:0}: Error finding container e75bd9ba680c573d3ee8aef12e40a239173030e6622c50befd8486e8764b7529: Status 404 returned error can't find the container with id e75bd9ba680c573d3ee8aef12e40a239173030e6622c50befd8486e8764b7529
Apr 21 10:09:40.039084 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:40.039068 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 10:09:40.289851 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:40.289813 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-697c46fd45-8tb9j" event={"ID":"914fac9c-114b-46b3-9d94-32255647b5bf","Type":"ContainerStarted","Data":"07c2e5832ab81e636c24a3f577fd36af53c55181392b90bdef816fdf5eb8e0d3"}
Apr 21 10:09:40.289851 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:40.289854 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-697c46fd45-8tb9j" event={"ID":"914fac9c-114b-46b3-9d94-32255647b5bf","Type":"ContainerStarted","Data":"e75bd9ba680c573d3ee8aef12e40a239173030e6622c50befd8486e8764b7529"}
Apr 21 10:09:40.305567 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:40.305518 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-697c46fd45-8tb9j" podStartSLOduration=1.30550408 podStartE2EDuration="1.30550408s" podCreationTimestamp="2026-04-21 10:09:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:09:40.304535468 +0000 UTC m=+348.643069485" watchObservedRunningTime="2026-04-21 10:09:40.30550408 +0000 UTC m=+348.644038097"
Apr 21 10:09:49.907808 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:49.907694 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:49.907808 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:49.907782 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:49.912527 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:49.912503 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:50.320100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:50.320073 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-697c46fd45-8tb9j"
Apr 21 10:09:50.395093 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:09:50.395059 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f6fff848f-58wsb"]
Apr 21 10:10:04.029321 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:04.029275 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gslnz"]
Apr 21 10:10:04.033576 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:04.033550 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gslnz"
Apr 21 10:10:04.036169 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:04.036146 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 21 10:10:04.042233 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:04.042205 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gslnz"]
Apr 21 10:10:04.059078 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:04.059043 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6c7463f0-ead6-4060-b7c0-a5b97811f455-original-pull-secret\") pod \"global-pull-secret-syncer-gslnz\" (UID: \"6c7463f0-ead6-4060-b7c0-a5b97811f455\") " pod="kube-system/global-pull-secret-syncer-gslnz"
Apr 21 10:10:04.059255 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:04.059091 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6c7463f0-ead6-4060-b7c0-a5b97811f455-dbus\") pod \"global-pull-secret-syncer-gslnz\" (UID: \"6c7463f0-ead6-4060-b7c0-a5b97811f455\") " pod="kube-system/global-pull-secret-syncer-gslnz"
Apr 21 10:10:04.059255 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:04.059151 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6c7463f0-ead6-4060-b7c0-a5b97811f455-kubelet-config\") pod \"global-pull-secret-syncer-gslnz\" (UID: \"6c7463f0-ead6-4060-b7c0-a5b97811f455\") " pod="kube-system/global-pull-secret-syncer-gslnz"
Apr 21 10:10:04.159550 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:04.159514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6c7463f0-ead6-4060-b7c0-a5b97811f455-original-pull-secret\") pod \"global-pull-secret-syncer-gslnz\" (UID: \"6c7463f0-ead6-4060-b7c0-a5b97811f455\") " pod="kube-system/global-pull-secret-syncer-gslnz"
Apr 21 10:10:04.159793 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:04.159562 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6c7463f0-ead6-4060-b7c0-a5b97811f455-dbus\") pod \"global-pull-secret-syncer-gslnz\" (UID: \"6c7463f0-ead6-4060-b7c0-a5b97811f455\") " pod="kube-system/global-pull-secret-syncer-gslnz"
Apr 21 10:10:04.159793 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:04.159586 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6c7463f0-ead6-4060-b7c0-a5b97811f455-kubelet-config\") pod \"global-pull-secret-syncer-gslnz\" (UID: \"6c7463f0-ead6-4060-b7c0-a5b97811f455\") " pod="kube-system/global-pull-secret-syncer-gslnz"
Apr 21 10:10:04.159793 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:04.159688 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6c7463f0-ead6-4060-b7c0-a5b97811f455-kubelet-config\") pod \"global-pull-secret-syncer-gslnz\" (UID: \"6c7463f0-ead6-4060-b7c0-a5b97811f455\") " pod="kube-system/global-pull-secret-syncer-gslnz"
Apr 21 10:10:04.159793 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:04.159726 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6c7463f0-ead6-4060-b7c0-a5b97811f455-dbus\") pod \"global-pull-secret-syncer-gslnz\" (UID: \"6c7463f0-ead6-4060-b7c0-a5b97811f455\") " pod="kube-system/global-pull-secret-syncer-gslnz"
Apr 21 10:10:04.161976 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:04.161952 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6c7463f0-ead6-4060-b7c0-a5b97811f455-original-pull-secret\") pod \"global-pull-secret-syncer-gslnz\" (UID: \"6c7463f0-ead6-4060-b7c0-a5b97811f455\") " pod="kube-system/global-pull-secret-syncer-gslnz"
Apr 21 10:10:04.344058 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:04.343978 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gslnz"
Apr 21 10:10:04.462466 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:04.462434 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gslnz"]
Apr 21 10:10:04.465978 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:10:04.465943 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c7463f0_ead6_4060_b7c0_a5b97811f455.slice/crio-fb05fb8f64004c37e13052fb95d6c8bc19527e5dc77646422762efd040b35866 WatchSource:0}: Error finding container fb05fb8f64004c37e13052fb95d6c8bc19527e5dc77646422762efd040b35866: Status 404 returned error can't find the container with id fb05fb8f64004c37e13052fb95d6c8bc19527e5dc77646422762efd040b35866
Apr 21 10:10:05.362738 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:05.362698 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gslnz" event={"ID":"6c7463f0-ead6-4060-b7c0-a5b97811f455","Type":"ContainerStarted","Data":"fb05fb8f64004c37e13052fb95d6c8bc19527e5dc77646422762efd040b35866"}
Apr 21 10:10:09.376696 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:09.376655 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gslnz" event={"ID":"6c7463f0-ead6-4060-b7c0-a5b97811f455","Type":"ContainerStarted","Data":"ee9d4647ff94b54bd72c9ba721f2ecdfbec4c38c8559d297e0d80768643773f9"}
Apr 21 10:10:09.393691 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:09.393639 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gslnz" podStartSLOduration=1.522502476 podStartE2EDuration="5.393622767s" podCreationTimestamp="2026-04-21 10:10:04 +0000 UTC" firstStartedPulling="2026-04-21 10:10:04.467599521 +0000 UTC m=+372.806133529" lastFinishedPulling="2026-04-21 10:10:08.338719822 +0000 UTC m=+376.677253820" observedRunningTime="2026-04-21 10:10:09.392528533 +0000 UTC m=+377.731062551" watchObservedRunningTime="2026-04-21 10:10:09.393622767 +0000 UTC m=+377.732156785"
Apr 21 10:10:15.414336 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.414289 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5f6fff848f-58wsb" podUID="146bb9bb-e778-4eee-b20b-750bf8b9c59d" containerName="console" containerID="cri-o://61328c9a4ef3149b98118a6a2ca1b64553cdc3ded2570e11ef88c63f7b4adfb1" gracePeriod=15
Apr 21 10:10:15.656271 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.656247 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f6fff848f-58wsb_146bb9bb-e778-4eee-b20b-750bf8b9c59d/console/0.log"
Apr 21 10:10:15.656410 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.656308 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5f6fff848f-58wsb"
Apr 21 10:10:15.752964 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.752929 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-config\") pod \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") "
Apr 21 10:10:15.753154 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.752982 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-service-ca\") pod \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") "
Apr 21 10:10:15.753154 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.753010 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lf7q6\" (UniqueName: \"kubernetes.io/projected/146bb9bb-e778-4eee-b20b-750bf8b9c59d-kube-api-access-lf7q6\") pod \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") "
Apr 21 10:10:15.753154 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.753034 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-oauth-serving-cert\") pod \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") "
Apr 21 10:10:15.753154 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.753055 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-trusted-ca-bundle\") pod \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") "
Apr 21 10:10:15.753154 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.753077 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-oauth-config\") pod \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") "
Apr 21 10:10:15.753154 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.753105 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-serving-cert\") pod \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\" (UID: \"146bb9bb-e778-4eee-b20b-750bf8b9c59d\") "
Apr 21 10:10:15.753522 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.753483 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-config" (OuterVolumeSpecName: "console-config") pod "146bb9bb-e778-4eee-b20b-750bf8b9c59d" (UID: "146bb9bb-e778-4eee-b20b-750bf8b9c59d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:10:15.753602 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.753576 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-service-ca" (OuterVolumeSpecName: "service-ca") pod "146bb9bb-e778-4eee-b20b-750bf8b9c59d" (UID: "146bb9bb-e778-4eee-b20b-750bf8b9c59d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:10:15.753602 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.753586 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "146bb9bb-e778-4eee-b20b-750bf8b9c59d" (UID: "146bb9bb-e778-4eee-b20b-750bf8b9c59d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:10:15.753680 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.753596 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "146bb9bb-e778-4eee-b20b-750bf8b9c59d" (UID: "146bb9bb-e778-4eee-b20b-750bf8b9c59d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 10:10:15.755290 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.755261 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146bb9bb-e778-4eee-b20b-750bf8b9c59d-kube-api-access-lf7q6" (OuterVolumeSpecName: "kube-api-access-lf7q6") pod "146bb9bb-e778-4eee-b20b-750bf8b9c59d" (UID: "146bb9bb-e778-4eee-b20b-750bf8b9c59d"). InnerVolumeSpecName "kube-api-access-lf7q6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:10:15.755290 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.755274 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "146bb9bb-e778-4eee-b20b-750bf8b9c59d" (UID: "146bb9bb-e778-4eee-b20b-750bf8b9c59d"). InnerVolumeSpecName "console-serving-cert".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:10:15.755457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.755369 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "146bb9bb-e778-4eee-b20b-750bf8b9c59d" (UID: "146bb9bb-e778-4eee-b20b-750bf8b9c59d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:10:15.854559 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.854521 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-oauth-serving-cert\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:10:15.854559 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.854550 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-trusted-ca-bundle\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:10:15.854559 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.854561 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-oauth-config\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:10:15.854559 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.854571 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-serving-cert\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:10:15.854846 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.854580 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-console-config\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:10:15.854846 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.854589 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/146bb9bb-e778-4eee-b20b-750bf8b9c59d-service-ca\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:10:15.854846 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:15.854598 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lf7q6\" (UniqueName: \"kubernetes.io/projected/146bb9bb-e778-4eee-b20b-750bf8b9c59d-kube-api-access-lf7q6\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:10:16.395903 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:16.395876 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f6fff848f-58wsb_146bb9bb-e778-4eee-b20b-750bf8b9c59d/console/0.log" Apr 21 10:10:16.396077 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:16.395914 2577 generic.go:358] "Generic (PLEG): container finished" podID="146bb9bb-e778-4eee-b20b-750bf8b9c59d" containerID="61328c9a4ef3149b98118a6a2ca1b64553cdc3ded2570e11ef88c63f7b4adfb1" exitCode=2 Apr 21 10:10:16.396077 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:16.395967 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6fff848f-58wsb" event={"ID":"146bb9bb-e778-4eee-b20b-750bf8b9c59d","Type":"ContainerDied","Data":"61328c9a4ef3149b98118a6a2ca1b64553cdc3ded2570e11ef88c63f7b4adfb1"} Apr 21 10:10:16.396077 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:16.395978 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f6fff848f-58wsb" Apr 21 10:10:16.396077 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:16.395991 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f6fff848f-58wsb" event={"ID":"146bb9bb-e778-4eee-b20b-750bf8b9c59d","Type":"ContainerDied","Data":"2a3a46bc3e4245f2bc53227c7f9f3302186347ad04dd3e415bbb180cb4b83ad3"} Apr 21 10:10:16.396077 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:16.396005 2577 scope.go:117] "RemoveContainer" containerID="61328c9a4ef3149b98118a6a2ca1b64553cdc3ded2570e11ef88c63f7b4adfb1" Apr 21 10:10:16.403424 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:16.403399 2577 scope.go:117] "RemoveContainer" containerID="61328c9a4ef3149b98118a6a2ca1b64553cdc3ded2570e11ef88c63f7b4adfb1" Apr 21 10:10:16.403707 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:10:16.403681 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61328c9a4ef3149b98118a6a2ca1b64553cdc3ded2570e11ef88c63f7b4adfb1\": container with ID starting with 61328c9a4ef3149b98118a6a2ca1b64553cdc3ded2570e11ef88c63f7b4adfb1 not found: ID does not exist" containerID="61328c9a4ef3149b98118a6a2ca1b64553cdc3ded2570e11ef88c63f7b4adfb1" Apr 21 10:10:16.403829 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:16.403718 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61328c9a4ef3149b98118a6a2ca1b64553cdc3ded2570e11ef88c63f7b4adfb1"} err="failed to get container status \"61328c9a4ef3149b98118a6a2ca1b64553cdc3ded2570e11ef88c63f7b4adfb1\": rpc error: code = NotFound desc = could not find container \"61328c9a4ef3149b98118a6a2ca1b64553cdc3ded2570e11ef88c63f7b4adfb1\": container with ID starting with 61328c9a4ef3149b98118a6a2ca1b64553cdc3ded2570e11ef88c63f7b4adfb1 not found: ID does not exist" Apr 21 10:10:16.413742 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:16.413707 2577 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5f6fff848f-58wsb"] Apr 21 10:10:16.416211 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:16.416189 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5f6fff848f-58wsb"] Apr 21 10:10:18.231739 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:18.231695 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146bb9bb-e778-4eee-b20b-750bf8b9c59d" path="/var/lib/kubelet/pods/146bb9bb-e778-4eee-b20b-750bf8b9c59d/volumes" Apr 21 10:10:41.886926 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:41.886884 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4"] Apr 21 10:10:41.887573 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:41.887309 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="146bb9bb-e778-4eee-b20b-750bf8b9c59d" containerName="console" Apr 21 10:10:41.887573 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:41.887328 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="146bb9bb-e778-4eee-b20b-750bf8b9c59d" containerName="console" Apr 21 10:10:41.887573 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:41.887410 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="146bb9bb-e778-4eee-b20b-750bf8b9c59d" containerName="console" Apr 21 10:10:41.891482 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:41.891457 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4" Apr 21 10:10:41.893941 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:41.893918 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 21 10:10:41.894945 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:41.894921 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 21 10:10:41.894945 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:41.894939 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 21 10:10:41.895114 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:41.894964 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 21 10:10:41.895114 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:41.894937 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-bndm8\"" Apr 21 10:10:41.900275 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:41.900253 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4"] Apr 21 10:10:41.954633 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:41.954591 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlnqz\" (UniqueName: \"kubernetes.io/projected/414a0695-8b79-46f8-a95b-1455cb9afdab-kube-api-access-zlnqz\") pod \"managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4\" (UID: \"414a0695-8b79-46f8-a95b-1455cb9afdab\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4" Apr 21 10:10:41.954894 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:41.954659 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/414a0695-8b79-46f8-a95b-1455cb9afdab-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4\" (UID: \"414a0695-8b79-46f8-a95b-1455cb9afdab\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4" Apr 21 10:10:41.999247 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:41.999206 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"] Apr 21 10:10:42.022038 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.021993 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"] Apr 21 10:10:42.022216 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.022117 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj" Apr 21 10:10:42.024601 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.024579 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 21 10:10:42.024736 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.024704 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 21 10:10:42.024882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.024832 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 21 10:10:42.024953 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.024882 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 21 10:10:42.055583 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.055541 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/414a0695-8b79-46f8-a95b-1455cb9afdab-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4\" (UID: \"414a0695-8b79-46f8-a95b-1455cb9afdab\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4" Apr 21 10:10:42.055818 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.055637 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zlnqz\" (UniqueName: \"kubernetes.io/projected/414a0695-8b79-46f8-a95b-1455cb9afdab-kube-api-access-zlnqz\") pod \"managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4\" (UID: \"414a0695-8b79-46f8-a95b-1455cb9afdab\") " 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4" Apr 21 10:10:42.058134 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.058113 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/414a0695-8b79-46f8-a95b-1455cb9afdab-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4\" (UID: \"414a0695-8b79-46f8-a95b-1455cb9afdab\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4" Apr 21 10:10:42.063130 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.063111 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlnqz\" (UniqueName: \"kubernetes.io/projected/414a0695-8b79-46f8-a95b-1455cb9afdab-kube-api-access-zlnqz\") pod \"managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4\" (UID: \"414a0695-8b79-46f8-a95b-1455cb9afdab\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4" Apr 21 10:10:42.156495 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.156388 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/283bb22f-e947-4f2f-a920-2f627992ea05-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj" Apr 21 10:10:42.156495 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.156464 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/283bb22f-e947-4f2f-a920-2f627992ea05-ca\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj" 
Apr 21 10:10:42.156724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.156514 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n65p\" (UniqueName: \"kubernetes.io/projected/283bb22f-e947-4f2f-a920-2f627992ea05-kube-api-access-2n65p\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.156724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.156548 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/283bb22f-e947-4f2f-a920-2f627992ea05-hub\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.156724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.156580 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/283bb22f-e947-4f2f-a920-2f627992ea05-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.156724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.156600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/283bb22f-e947-4f2f-a920-2f627992ea05-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.210687 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.210652 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4"
Apr 21 10:10:42.257989 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.257949 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/283bb22f-e947-4f2f-a920-2f627992ea05-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.257989 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.257992 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/283bb22f-e947-4f2f-a920-2f627992ea05-ca\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.258237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.258035 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2n65p\" (UniqueName: \"kubernetes.io/projected/283bb22f-e947-4f2f-a920-2f627992ea05-kube-api-access-2n65p\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.258237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.258069 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/283bb22f-e947-4f2f-a920-2f627992ea05-hub\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.258237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.258104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/283bb22f-e947-4f2f-a920-2f627992ea05-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.258237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.258127 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/283bb22f-e947-4f2f-a920-2f627992ea05-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.259000 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.258958 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/283bb22f-e947-4f2f-a920-2f627992ea05-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.261678 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.261632 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/283bb22f-e947-4f2f-a920-2f627992ea05-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.262332 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.262070 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/283bb22f-e947-4f2f-a920-2f627992ea05-ca\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.262332 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.262268 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/283bb22f-e947-4f2f-a920-2f627992ea05-hub\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.263832 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.263397 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/283bb22f-e947-4f2f-a920-2f627992ea05-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.267176 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.267132 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n65p\" (UniqueName: \"kubernetes.io/projected/283bb22f-e947-4f2f-a920-2f627992ea05-kube-api-access-2n65p\") pod \"cluster-proxy-proxy-agent-5884f6764d-tnpkj\" (UID: \"283bb22f-e947-4f2f-a920-2f627992ea05\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.331344 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.331304 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"
Apr 21 10:10:42.334709 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.334673 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4"]
Apr 21 10:10:42.338694 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:10:42.338667 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod414a0695_8b79_46f8_a95b_1455cb9afdab.slice/crio-443aeb5b929597b745531cab28b84b6c79b67909da9f2535dc2a832edd326909 WatchSource:0}: Error finding container 443aeb5b929597b745531cab28b84b6c79b67909da9f2535dc2a832edd326909: Status 404 returned error can't find the container with id 443aeb5b929597b745531cab28b84b6c79b67909da9f2535dc2a832edd326909
Apr 21 10:10:42.457307 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.457277 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj"]
Apr 21 10:10:42.460439 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:10:42.460410 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod283bb22f_e947_4f2f_a920_2f627992ea05.slice/crio-9fcbda9644cea15785a4c30c3cb1ec395e7525d4bb1412b93050423ca8b9f117 WatchSource:0}: Error finding container 9fcbda9644cea15785a4c30c3cb1ec395e7525d4bb1412b93050423ca8b9f117: Status 404 returned error can't find the container with id 9fcbda9644cea15785a4c30c3cb1ec395e7525d4bb1412b93050423ca8b9f117
Apr 21 10:10:42.474498 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.474448 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4" event={"ID":"414a0695-8b79-46f8-a95b-1455cb9afdab","Type":"ContainerStarted","Data":"443aeb5b929597b745531cab28b84b6c79b67909da9f2535dc2a832edd326909"}
Apr 21 10:10:42.475529 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:42.475497 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj" event={"ID":"283bb22f-e947-4f2f-a920-2f627992ea05","Type":"ContainerStarted","Data":"9fcbda9644cea15785a4c30c3cb1ec395e7525d4bb1412b93050423ca8b9f117"}
Apr 21 10:10:46.488589 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:46.488550 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4" event={"ID":"414a0695-8b79-46f8-a95b-1455cb9afdab","Type":"ContainerStarted","Data":"04c177dff28757880061cb0d6b1f7b54733ef93b94721fafa123267cdea91ef8"}
Apr 21 10:10:46.489859 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:46.489835 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj" event={"ID":"283bb22f-e947-4f2f-a920-2f627992ea05","Type":"ContainerStarted","Data":"c68aedb170f2d953fb091e0570df87d1bd092306c81eadddaf8d00d7d7385f16"}
Apr 21 10:10:46.503820 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:46.503734 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-8b5b8cf59-hkqq4" podStartSLOduration=1.960333321 podStartE2EDuration="5.503717616s" podCreationTimestamp="2026-04-21 10:10:41 +0000 UTC" firstStartedPulling="2026-04-21 10:10:42.340692745 +0000 UTC m=+410.679226744" lastFinishedPulling="2026-04-21 10:10:45.884077045 +0000 UTC m=+414.222611039" observedRunningTime="2026-04-21 10:10:46.502734286 +0000 UTC m=+414.841268314" watchObservedRunningTime="2026-04-21 10:10:46.503717616 +0000 UTC m=+414.842251633"
Apr 21 10:10:49.500267 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:49.500231 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj" event={"ID":"283bb22f-e947-4f2f-a920-2f627992ea05","Type":"ContainerStarted","Data":"d93954a39056d70c9c5e2250fa4eb0715db385a05e13bcd45759f3ef9f4ed111"}
Apr 21 10:10:49.500267 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:49.500270 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj" event={"ID":"283bb22f-e947-4f2f-a920-2f627992ea05","Type":"ContainerStarted","Data":"cc18f0ae2cb106bfa525360ce9e2acf1d5ff6302b9e6a4b36f3e53c57dcefc2e"}
Apr 21 10:10:49.518401 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:10:49.518336 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5884f6764d-tnpkj" podStartSLOduration=1.811558324 podStartE2EDuration="8.518315595s" podCreationTimestamp="2026-04-21 10:10:41 +0000 UTC" firstStartedPulling="2026-04-21 10:10:42.462244262 +0000 UTC m=+410.800778261" lastFinishedPulling="2026-04-21 10:10:49.169001523 +0000 UTC m=+417.507535532" observedRunningTime="2026-04-21 10:10:49.516152681 +0000 UTC m=+417.854686700" watchObservedRunningTime="2026-04-21 10:10:49.518315595 +0000 UTC m=+417.856849614"
Apr 21 10:12:27.283148 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.283116 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-c2h6w"]
Apr 21 10:12:27.286183 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.286162 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-c2h6w"
Apr 21 10:12:27.288513 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.288485 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 21 10:12:27.288631 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.288552 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 21 10:12:27.289336 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.289318 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 21 10:12:27.289439 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.289341 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-mv2mf\""
Apr 21 10:12:27.295331 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.295307 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-c2h6w"]
Apr 21 10:12:27.369472 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.369427 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/728a7d4c-4b8c-45db-a0ab-ff664f9148bf-data\") pod \"seaweedfs-86cc847c5c-c2h6w\" (UID: \"728a7d4c-4b8c-45db-a0ab-ff664f9148bf\") " pod="kserve/seaweedfs-86cc847c5c-c2h6w"
Apr 21 10:12:27.369472 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.369472 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt6q4\" (UniqueName: \"kubernetes.io/projected/728a7d4c-4b8c-45db-a0ab-ff664f9148bf-kube-api-access-gt6q4\") pod \"seaweedfs-86cc847c5c-c2h6w\" (UID: \"728a7d4c-4b8c-45db-a0ab-ff664f9148bf\") " pod="kserve/seaweedfs-86cc847c5c-c2h6w"
Apr 21 10:12:27.470086 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.470031 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/728a7d4c-4b8c-45db-a0ab-ff664f9148bf-data\") pod \"seaweedfs-86cc847c5c-c2h6w\" (UID: \"728a7d4c-4b8c-45db-a0ab-ff664f9148bf\") " pod="kserve/seaweedfs-86cc847c5c-c2h6w"
Apr 21 10:12:27.470086 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.470091 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gt6q4\" (UniqueName: \"kubernetes.io/projected/728a7d4c-4b8c-45db-a0ab-ff664f9148bf-kube-api-access-gt6q4\") pod \"seaweedfs-86cc847c5c-c2h6w\" (UID: \"728a7d4c-4b8c-45db-a0ab-ff664f9148bf\") " pod="kserve/seaweedfs-86cc847c5c-c2h6w"
Apr 21 10:12:27.470439 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.470419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/728a7d4c-4b8c-45db-a0ab-ff664f9148bf-data\") pod \"seaweedfs-86cc847c5c-c2h6w\" (UID: \"728a7d4c-4b8c-45db-a0ab-ff664f9148bf\") " pod="kserve/seaweedfs-86cc847c5c-c2h6w"
Apr 21 10:12:27.489839 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.489810 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt6q4\" (UniqueName: \"kubernetes.io/projected/728a7d4c-4b8c-45db-a0ab-ff664f9148bf-kube-api-access-gt6q4\") pod \"seaweedfs-86cc847c5c-c2h6w\" (UID: \"728a7d4c-4b8c-45db-a0ab-ff664f9148bf\") " pod="kserve/seaweedfs-86cc847c5c-c2h6w"
Apr 21 10:12:27.595534 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.595426 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-c2h6w"
Apr 21 10:12:27.722239 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.722205 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-c2h6w"]
Apr 21 10:12:27.725238 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:12:27.725207 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod728a7d4c_4b8c_45db_a0ab_ff664f9148bf.slice/crio-a349f84bc5c351b3634eebb03cc22498b9f502d609c0d4363c8442807f043318 WatchSource:0}: Error finding container a349f84bc5c351b3634eebb03cc22498b9f502d609c0d4363c8442807f043318: Status 404 returned error can't find the container with id a349f84bc5c351b3634eebb03cc22498b9f502d609c0d4363c8442807f043318
Apr 21 10:12:27.777713 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:27.777654 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-c2h6w" event={"ID":"728a7d4c-4b8c-45db-a0ab-ff664f9148bf","Type":"ContainerStarted","Data":"a349f84bc5c351b3634eebb03cc22498b9f502d609c0d4363c8442807f043318"}
Apr 21 10:12:30.789569 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:30.789487 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-c2h6w" event={"ID":"728a7d4c-4b8c-45db-a0ab-ff664f9148bf","Type":"ContainerStarted","Data":"d01cd11a1e9aecb948fada40c3f9380b034312e61a99e97b48e5dd37cd18ca61"}
Apr 21 10:12:30.789944 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:30.789611 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-c2h6w"
Apr 21 10:12:30.805888 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:30.805827 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-c2h6w" podStartSLOduration=1.054480341 podStartE2EDuration="3.805811705s" podCreationTimestamp="2026-04-21 10:12:27 +0000 UTC"
firstStartedPulling="2026-04-21 10:12:27.72639114 +0000 UTC m=+516.064925136" lastFinishedPulling="2026-04-21 10:12:30.477722505 +0000 UTC m=+518.816256500" observedRunningTime="2026-04-21 10:12:30.804389422 +0000 UTC m=+519.142923440" watchObservedRunningTime="2026-04-21 10:12:30.805811705 +0000 UTC m=+519.144345722" Apr 21 10:12:36.794815 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:12:36.794781 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-c2h6w" Apr 21 10:13:24.848916 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:24.848811 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-65885cbd5d-6gqst"] Apr 21 10:13:24.852358 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:24.852331 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:24.872072 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:24.872042 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65885cbd5d-6gqst"] Apr 21 10:13:24.932871 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:24.932834 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/276784b9-3960-4ce8-b86d-cc031d35d6cd-trusted-ca-bundle\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:24.933078 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:24.932923 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/276784b9-3960-4ce8-b86d-cc031d35d6cd-oauth-serving-cert\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:24.933078 
ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:24.932989 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/276784b9-3960-4ce8-b86d-cc031d35d6cd-service-ca\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:24.933078 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:24.933068 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/276784b9-3960-4ce8-b86d-cc031d35d6cd-console-serving-cert\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:24.933461 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:24.933103 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/276784b9-3960-4ce8-b86d-cc031d35d6cd-console-oauth-config\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:24.933461 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:24.933174 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/276784b9-3960-4ce8-b86d-cc031d35d6cd-console-config\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:24.933461 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:24.933216 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfq9l\" (UniqueName: 
\"kubernetes.io/projected/276784b9-3960-4ce8-b86d-cc031d35d6cd-kube-api-access-wfq9l\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.033545 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.033504 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/276784b9-3960-4ce8-b86d-cc031d35d6cd-trusted-ca-bundle\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.033545 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.033543 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/276784b9-3960-4ce8-b86d-cc031d35d6cd-oauth-serving-cert\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.033850 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.033573 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/276784b9-3960-4ce8-b86d-cc031d35d6cd-service-ca\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.033850 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.033611 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/276784b9-3960-4ce8-b86d-cc031d35d6cd-console-serving-cert\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.033850 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.033632 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/276784b9-3960-4ce8-b86d-cc031d35d6cd-console-oauth-config\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.033850 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.033657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/276784b9-3960-4ce8-b86d-cc031d35d6cd-console-config\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.033850 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.033673 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfq9l\" (UniqueName: \"kubernetes.io/projected/276784b9-3960-4ce8-b86d-cc031d35d6cd-kube-api-access-wfq9l\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.034442 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.034389 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/276784b9-3960-4ce8-b86d-cc031d35d6cd-oauth-serving-cert\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.034574 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.034493 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/276784b9-3960-4ce8-b86d-cc031d35d6cd-service-ca\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.034574 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:13:25.034534 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/276784b9-3960-4ce8-b86d-cc031d35d6cd-trusted-ca-bundle\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.034574 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.034549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/276784b9-3960-4ce8-b86d-cc031d35d6cd-console-config\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.036214 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.036187 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/276784b9-3960-4ce8-b86d-cc031d35d6cd-console-oauth-config\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.036453 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.036430 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/276784b9-3960-4ce8-b86d-cc031d35d6cd-console-serving-cert\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.042292 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.042268 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfq9l\" (UniqueName: \"kubernetes.io/projected/276784b9-3960-4ce8-b86d-cc031d35d6cd-kube-api-access-wfq9l\") pod \"console-65885cbd5d-6gqst\" (UID: \"276784b9-3960-4ce8-b86d-cc031d35d6cd\") " pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.162346 
ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.162237 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:25.286725 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.286681 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-65885cbd5d-6gqst"] Apr 21 10:13:25.289605 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:13:25.289555 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod276784b9_3960_4ce8_b86d_cc031d35d6cd.slice/crio-9bd9ee9fa95eec73e5b3130acc531011b367231820f3b09cdb2865aaf4e3ee16 WatchSource:0}: Error finding container 9bd9ee9fa95eec73e5b3130acc531011b367231820f3b09cdb2865aaf4e3ee16: Status 404 returned error can't find the container with id 9bd9ee9fa95eec73e5b3130acc531011b367231820f3b09cdb2865aaf4e3ee16 Apr 21 10:13:25.947631 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.947593 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65885cbd5d-6gqst" event={"ID":"276784b9-3960-4ce8-b86d-cc031d35d6cd","Type":"ContainerStarted","Data":"0d6ef439619c10111468b0648bbd003e51e92467378647af9236932886d694a2"} Apr 21 10:13:25.947631 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.947631 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-65885cbd5d-6gqst" event={"ID":"276784b9-3960-4ce8-b86d-cc031d35d6cd","Type":"ContainerStarted","Data":"9bd9ee9fa95eec73e5b3130acc531011b367231820f3b09cdb2865aaf4e3ee16"} Apr 21 10:13:25.969244 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:25.969195 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-65885cbd5d-6gqst" podStartSLOduration=1.969179432 podStartE2EDuration="1.969179432s" podCreationTimestamp="2026-04-21 10:13:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:13:25.967936054 +0000 UTC m=+574.306470070" watchObservedRunningTime="2026-04-21 10:13:25.969179432 +0000 UTC m=+574.307713448" Apr 21 10:13:35.162397 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:35.162349 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:35.162397 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:35.162391 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:35.167226 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:35.167202 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:35.981435 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:35.981403 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-65885cbd5d-6gqst" Apr 21 10:13:36.019862 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:36.019820 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-697c46fd45-8tb9j"] Apr 21 10:13:52.135769 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:52.135726 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 10:13:52.137327 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:13:52.137302 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 10:14:01.039911 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.039872 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-697c46fd45-8tb9j" 
podUID="914fac9c-114b-46b3-9d94-32255647b5bf" containerName="console" containerID="cri-o://07c2e5832ab81e636c24a3f577fd36af53c55181392b90bdef816fdf5eb8e0d3" gracePeriod=15 Apr 21 10:14:01.276190 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.276166 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-697c46fd45-8tb9j_914fac9c-114b-46b3-9d94-32255647b5bf/console/0.log" Apr 21 10:14:01.276327 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.276231 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-697c46fd45-8tb9j" Apr 21 10:14:01.411647 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.411551 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-console-config\") pod \"914fac9c-114b-46b3-9d94-32255647b5bf\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " Apr 21 10:14:01.411647 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.411592 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-oauth-serving-cert\") pod \"914fac9c-114b-46b3-9d94-32255647b5bf\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " Apr 21 10:14:01.411647 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.411636 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-trusted-ca-bundle\") pod \"914fac9c-114b-46b3-9d94-32255647b5bf\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " Apr 21 10:14:01.411951 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.411665 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/914fac9c-114b-46b3-9d94-32255647b5bf-console-serving-cert\") pod \"914fac9c-114b-46b3-9d94-32255647b5bf\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " Apr 21 10:14:01.411951 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.411701 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-service-ca\") pod \"914fac9c-114b-46b3-9d94-32255647b5bf\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " Apr 21 10:14:01.411951 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.411721 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/914fac9c-114b-46b3-9d94-32255647b5bf-console-oauth-config\") pod \"914fac9c-114b-46b3-9d94-32255647b5bf\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " Apr 21 10:14:01.411951 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.411853 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp7xs\" (UniqueName: \"kubernetes.io/projected/914fac9c-114b-46b3-9d94-32255647b5bf-kube-api-access-pp7xs\") pod \"914fac9c-114b-46b3-9d94-32255647b5bf\" (UID: \"914fac9c-114b-46b3-9d94-32255647b5bf\") " Apr 21 10:14:01.412157 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.412131 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-console-config" (OuterVolumeSpecName: "console-config") pod "914fac9c-114b-46b3-9d94-32255647b5bf" (UID: "914fac9c-114b-46b3-9d94-32255647b5bf"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:14:01.412269 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.412248 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "914fac9c-114b-46b3-9d94-32255647b5bf" (UID: "914fac9c-114b-46b3-9d94-32255647b5bf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:14:01.412319 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.412264 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "914fac9c-114b-46b3-9d94-32255647b5bf" (UID: "914fac9c-114b-46b3-9d94-32255647b5bf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:14:01.412357 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.412305 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-service-ca" (OuterVolumeSpecName: "service-ca") pod "914fac9c-114b-46b3-9d94-32255647b5bf" (UID: "914fac9c-114b-46b3-9d94-32255647b5bf"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:14:01.414170 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.414145 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/914fac9c-114b-46b3-9d94-32255647b5bf-kube-api-access-pp7xs" (OuterVolumeSpecName: "kube-api-access-pp7xs") pod "914fac9c-114b-46b3-9d94-32255647b5bf" (UID: "914fac9c-114b-46b3-9d94-32255647b5bf"). InnerVolumeSpecName "kube-api-access-pp7xs". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 21 10:14:01.414270 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.414144 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914fac9c-114b-46b3-9d94-32255647b5bf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "914fac9c-114b-46b3-9d94-32255647b5bf" (UID: "914fac9c-114b-46b3-9d94-32255647b5bf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:14:01.414270 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.414205 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/914fac9c-114b-46b3-9d94-32255647b5bf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "914fac9c-114b-46b3-9d94-32255647b5bf" (UID: "914fac9c-114b-46b3-9d94-32255647b5bf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 21 10:14:01.513465 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.513425 2577 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-trusted-ca-bundle\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:14:01.513465 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.513457 2577 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/914fac9c-114b-46b3-9d94-32255647b5bf-console-serving-cert\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:14:01.513465 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.513467 2577 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-service-ca\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:14:01.513465 ip-10-0-133-157 
kubenswrapper[2577]: I0421 10:14:01.513476 2577 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/914fac9c-114b-46b3-9d94-32255647b5bf-console-oauth-config\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:14:01.513735 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.513488 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pp7xs\" (UniqueName: \"kubernetes.io/projected/914fac9c-114b-46b3-9d94-32255647b5bf-kube-api-access-pp7xs\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:14:01.513735 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.513498 2577 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-console-config\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:14:01.513735 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:01.513507 2577 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/914fac9c-114b-46b3-9d94-32255647b5bf-oauth-serving-cert\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:14:02.051172 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:02.051141 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-697c46fd45-8tb9j_914fac9c-114b-46b3-9d94-32255647b5bf/console/0.log" Apr 21 10:14:02.051538 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:02.051183 2577 generic.go:358] "Generic (PLEG): container finished" podID="914fac9c-114b-46b3-9d94-32255647b5bf" containerID="07c2e5832ab81e636c24a3f577fd36af53c55181392b90bdef816fdf5eb8e0d3" exitCode=2 Apr 21 10:14:02.051538 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:02.051280 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-697c46fd45-8tb9j" Apr 21 10:14:02.051538 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:02.051273 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-697c46fd45-8tb9j" event={"ID":"914fac9c-114b-46b3-9d94-32255647b5bf","Type":"ContainerDied","Data":"07c2e5832ab81e636c24a3f577fd36af53c55181392b90bdef816fdf5eb8e0d3"} Apr 21 10:14:02.051538 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:02.051387 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-697c46fd45-8tb9j" event={"ID":"914fac9c-114b-46b3-9d94-32255647b5bf","Type":"ContainerDied","Data":"e75bd9ba680c573d3ee8aef12e40a239173030e6622c50befd8486e8764b7529"} Apr 21 10:14:02.051538 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:02.051403 2577 scope.go:117] "RemoveContainer" containerID="07c2e5832ab81e636c24a3f577fd36af53c55181392b90bdef816fdf5eb8e0d3" Apr 21 10:14:02.059489 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:02.059461 2577 scope.go:117] "RemoveContainer" containerID="07c2e5832ab81e636c24a3f577fd36af53c55181392b90bdef816fdf5eb8e0d3" Apr 21 10:14:02.059907 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:14:02.059884 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c2e5832ab81e636c24a3f577fd36af53c55181392b90bdef816fdf5eb8e0d3\": container with ID starting with 07c2e5832ab81e636c24a3f577fd36af53c55181392b90bdef816fdf5eb8e0d3 not found: ID does not exist" containerID="07c2e5832ab81e636c24a3f577fd36af53c55181392b90bdef816fdf5eb8e0d3" Apr 21 10:14:02.059980 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:02.059919 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c2e5832ab81e636c24a3f577fd36af53c55181392b90bdef816fdf5eb8e0d3"} err="failed to get container status \"07c2e5832ab81e636c24a3f577fd36af53c55181392b90bdef816fdf5eb8e0d3\": rpc error: code = 
NotFound desc = could not find container \"07c2e5832ab81e636c24a3f577fd36af53c55181392b90bdef816fdf5eb8e0d3\": container with ID starting with 07c2e5832ab81e636c24a3f577fd36af53c55181392b90bdef816fdf5eb8e0d3 not found: ID does not exist"
Apr 21 10:14:02.071589 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:02.071557 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-697c46fd45-8tb9j"]
Apr 21 10:14:02.075022 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:02.074994 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-697c46fd45-8tb9j"]
Apr 21 10:14:02.231566 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:02.231526 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="914fac9c-114b-46b3-9d94-32255647b5bf" path="/var/lib/kubelet/pods/914fac9c-114b-46b3-9d94-32255647b5bf/volumes"
Apr 21 10:14:53.745217 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:53.745180 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-t6nkp"]
Apr 21 10:14:53.745612 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:53.745507 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="914fac9c-114b-46b3-9d94-32255647b5bf" containerName="console"
Apr 21 10:14:53.745612 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:53.745519 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="914fac9c-114b-46b3-9d94-32255647b5bf" containerName="console"
Apr 21 10:14:53.745612 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:53.745569 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="914fac9c-114b-46b3-9d94-32255647b5bf" containerName="console"
Apr 21 10:14:53.748560 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:53.748545 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-t6nkp"
Apr 21 10:14:53.750669 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:53.750651 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\""
Apr 21 10:14:53.754017 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:53.753991 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-t6nkp"]
Apr 21 10:14:53.806040 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:53.806008 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd8f5\" (UniqueName: \"kubernetes.io/projected/c90f570d-18e2-4d5f-9d01-40adb6de844e-kube-api-access-bd8f5\") pod \"s3-tls-init-serving-t6nkp\" (UID: \"c90f570d-18e2-4d5f-9d01-40adb6de844e\") " pod="kserve/s3-tls-init-serving-t6nkp"
Apr 21 10:14:53.906760 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:53.906722 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bd8f5\" (UniqueName: \"kubernetes.io/projected/c90f570d-18e2-4d5f-9d01-40adb6de844e-kube-api-access-bd8f5\") pod \"s3-tls-init-serving-t6nkp\" (UID: \"c90f570d-18e2-4d5f-9d01-40adb6de844e\") " pod="kserve/s3-tls-init-serving-t6nkp"
Apr 21 10:14:53.915121 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:53.915090 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bd8f5\" (UniqueName: \"kubernetes.io/projected/c90f570d-18e2-4d5f-9d01-40adb6de844e-kube-api-access-bd8f5\") pod \"s3-tls-init-serving-t6nkp\" (UID: \"c90f570d-18e2-4d5f-9d01-40adb6de844e\") " pod="kserve/s3-tls-init-serving-t6nkp"
Apr 21 10:14:54.058581 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:54.058492 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-t6nkp"
Apr 21 10:14:54.178905 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:54.178872 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-t6nkp"]
Apr 21 10:14:54.181973 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:14:54.181946 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc90f570d_18e2_4d5f_9d01_40adb6de844e.slice/crio-3b9f7dc84caa9055569146d9c41e7ebe0a63c1b5865ecdd05126f65c9fb6d867 WatchSource:0}: Error finding container 3b9f7dc84caa9055569146d9c41e7ebe0a63c1b5865ecdd05126f65c9fb6d867: Status 404 returned error can't find the container with id 3b9f7dc84caa9055569146d9c41e7ebe0a63c1b5865ecdd05126f65c9fb6d867
Apr 21 10:14:54.183875 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:54.183854 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 10:14:54.199496 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:54.199465 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-t6nkp" event={"ID":"c90f570d-18e2-4d5f-9d01-40adb6de844e","Type":"ContainerStarted","Data":"3b9f7dc84caa9055569146d9c41e7ebe0a63c1b5865ecdd05126f65c9fb6d867"}
Apr 21 10:14:59.216145 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:59.216107 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-t6nkp" event={"ID":"c90f570d-18e2-4d5f-9d01-40adb6de844e","Type":"ContainerStarted","Data":"a4a943a667fa8b019f61387e17d5e78766e90dd7caae3525a1dcf4976c72a701"}
Apr 21 10:14:59.231153 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:14:59.231093 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-t6nkp" podStartSLOduration=1.7596467900000001 podStartE2EDuration="6.231077781s" podCreationTimestamp="2026-04-21 10:14:53 +0000 UTC" firstStartedPulling="2026-04-21 10:14:54.183980982 +0000 UTC m=+662.522514976" lastFinishedPulling="2026-04-21 10:14:58.655411969 +0000 UTC m=+666.993945967" observedRunningTime="2026-04-21 10:14:59.230094433 +0000 UTC m=+667.568628448" watchObservedRunningTime="2026-04-21 10:14:59.231077781 +0000 UTC m=+667.569611835"
Apr 21 10:15:03.228668 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:03.228635 2577 generic.go:358] "Generic (PLEG): container finished" podID="c90f570d-18e2-4d5f-9d01-40adb6de844e" containerID="a4a943a667fa8b019f61387e17d5e78766e90dd7caae3525a1dcf4976c72a701" exitCode=0
Apr 21 10:15:03.229052 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:03.228709 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-t6nkp" event={"ID":"c90f570d-18e2-4d5f-9d01-40adb6de844e","Type":"ContainerDied","Data":"a4a943a667fa8b019f61387e17d5e78766e90dd7caae3525a1dcf4976c72a701"}
Apr 21 10:15:04.352065 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:04.352042 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-t6nkp"
Apr 21 10:15:04.398834 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:04.398796 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bd8f5\" (UniqueName: \"kubernetes.io/projected/c90f570d-18e2-4d5f-9d01-40adb6de844e-kube-api-access-bd8f5\") pod \"c90f570d-18e2-4d5f-9d01-40adb6de844e\" (UID: \"c90f570d-18e2-4d5f-9d01-40adb6de844e\") "
Apr 21 10:15:04.400950 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:04.400922 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c90f570d-18e2-4d5f-9d01-40adb6de844e-kube-api-access-bd8f5" (OuterVolumeSpecName: "kube-api-access-bd8f5") pod "c90f570d-18e2-4d5f-9d01-40adb6de844e" (UID: "c90f570d-18e2-4d5f-9d01-40adb6de844e"). InnerVolumeSpecName "kube-api-access-bd8f5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 10:15:04.499549 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:04.499462 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bd8f5\" (UniqueName: \"kubernetes.io/projected/c90f570d-18e2-4d5f-9d01-40adb6de844e-kube-api-access-bd8f5\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:15:05.235348 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:05.235311 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-t6nkp" event={"ID":"c90f570d-18e2-4d5f-9d01-40adb6de844e","Type":"ContainerDied","Data":"3b9f7dc84caa9055569146d9c41e7ebe0a63c1b5865ecdd05126f65c9fb6d867"}
Apr 21 10:15:05.235348 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:05.235345 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b9f7dc84caa9055569146d9c41e7ebe0a63c1b5865ecdd05126f65c9fb6d867"
Apr 21 10:15:05.235348 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:05.235322 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-t6nkp"
Apr 21 10:15:14.563770 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:14.563713 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"]
Apr 21 10:15:14.564155 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:14.564017 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c90f570d-18e2-4d5f-9d01-40adb6de844e" containerName="s3-tls-init-serving"
Apr 21 10:15:14.564155 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:14.564028 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c90f570d-18e2-4d5f-9d01-40adb6de844e" containerName="s3-tls-init-serving"
Apr 21 10:15:14.564155 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:14.564092 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c90f570d-18e2-4d5f-9d01-40adb6de844e" containerName="s3-tls-init-serving"
Apr 21 10:15:14.567115 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:14.567097 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"
Apr 21 10:15:14.569396 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:14.569370 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pjlcm\""
Apr 21 10:15:14.574341 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:14.574298 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"]
Apr 21 10:15:14.689718 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:14.689678 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3df4403-3e34-4fcd-9fc8-29c7df8113e1-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-78bfdcb64-p9kss\" (UID: \"a3df4403-3e34-4fcd-9fc8-29c7df8113e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"
Apr 21 10:15:14.790703 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:14.790665 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3df4403-3e34-4fcd-9fc8-29c7df8113e1-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-78bfdcb64-p9kss\" (UID: \"a3df4403-3e34-4fcd-9fc8-29c7df8113e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"
Apr 21 10:15:14.791126 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:14.791105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3df4403-3e34-4fcd-9fc8-29c7df8113e1-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-78bfdcb64-p9kss\" (UID: \"a3df4403-3e34-4fcd-9fc8-29c7df8113e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"
Apr 21 10:15:14.878278 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:14.878176 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"
Apr 21 10:15:15.000462 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:15.000320 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"]
Apr 21 10:15:15.003832 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:15:15.003803 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3df4403_3e34_4fcd_9fc8_29c7df8113e1.slice/crio-a8110c40ce8737505202481f4fe28ad06b707c6b1ce9fa8ffe86ee9fcbf3831d WatchSource:0}: Error finding container a8110c40ce8737505202481f4fe28ad06b707c6b1ce9fa8ffe86ee9fcbf3831d: Status 404 returned error can't find the container with id a8110c40ce8737505202481f4fe28ad06b707c6b1ce9fa8ffe86ee9fcbf3831d
Apr 21 10:15:15.263857 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:15.263820 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" event={"ID":"a3df4403-3e34-4fcd-9fc8-29c7df8113e1","Type":"ContainerStarted","Data":"a8110c40ce8737505202481f4fe28ad06b707c6b1ce9fa8ffe86ee9fcbf3831d"}
Apr 21 10:15:19.278493 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:19.278456 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" event={"ID":"a3df4403-3e34-4fcd-9fc8-29c7df8113e1","Type":"ContainerStarted","Data":"9706d70994ccc3db1a5f1659c52930789b91719be111fb6f65c8b3c143ae4a49"}
Apr 21 10:15:23.292154 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:23.292119 2577 generic.go:358] "Generic (PLEG): container finished" podID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerID="9706d70994ccc3db1a5f1659c52930789b91719be111fb6f65c8b3c143ae4a49" exitCode=0
Apr 21 10:15:23.292526 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:23.292192 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" event={"ID":"a3df4403-3e34-4fcd-9fc8-29c7df8113e1","Type":"ContainerDied","Data":"9706d70994ccc3db1a5f1659c52930789b91719be111fb6f65c8b3c143ae4a49"}
Apr 21 10:15:36.337984 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:36.337943 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" event={"ID":"a3df4403-3e34-4fcd-9fc8-29c7df8113e1","Type":"ContainerStarted","Data":"4d082764f43995c95578333ee423e8cbf1a2bf59ce256f78482531928d079347"}
Apr 21 10:15:39.349213 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:39.349175 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" event={"ID":"a3df4403-3e34-4fcd-9fc8-29c7df8113e1","Type":"ContainerStarted","Data":"444afd33e226919b4c5cb2f4bc735dd27931c26e0f9897fcc1c078e0b8d81bc3"}
Apr 21 10:15:39.349642 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:39.349403 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"
Apr 21 10:15:39.349642 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:39.349435 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"
Apr 21 10:15:39.350969 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:39.350922 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 21 10:15:39.351545 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:39.351520 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:15:39.366374 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:39.366328 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podStartSLOduration=1.420031823 podStartE2EDuration="25.366315436s" podCreationTimestamp="2026-04-21 10:15:14 +0000 UTC" firstStartedPulling="2026-04-21 10:15:15.005609971 +0000 UTC m=+683.344143966" lastFinishedPulling="2026-04-21 10:15:38.951893567 +0000 UTC m=+707.290427579" observedRunningTime="2026-04-21 10:15:39.363915902 +0000 UTC m=+707.702449917" watchObservedRunningTime="2026-04-21 10:15:39.366315436 +0000 UTC m=+707.704849452"
Apr 21 10:15:40.352685 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:40.352648 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 21 10:15:40.353082 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:40.353011 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:15:50.353148 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:50.353040 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 21 10:15:50.353536 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:15:50.353484 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:16:00.353617 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:00.353560 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 21 10:16:00.354077 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:00.354032 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:16:10.352767 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:10.352700 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 21 10:16:10.353263 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:10.353216 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:16:20.353320 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:20.353267 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 21 10:16:20.353776 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:20.353731 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:16:30.353247 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:30.353198 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 21 10:16:30.353645 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:30.353622 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:16:40.353370 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:40.353319 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 21 10:16:40.353906 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:40.353808 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:16:50.353534 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:50.353502 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"
Apr 21 10:16:50.354019 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:50.353713 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"
Apr 21 10:16:59.634392 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:59.634351 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"]
Apr 21 10:16:59.635185 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:59.634686 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="kserve-container" containerID="cri-o://4d082764f43995c95578333ee423e8cbf1a2bf59ce256f78482531928d079347" gracePeriod=30
Apr 21 10:16:59.635185 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:59.634708 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="agent" containerID="cri-o://444afd33e226919b4c5cb2f4bc735dd27931c26e0f9897fcc1c078e0b8d81bc3" gracePeriod=30
Apr 21 10:16:59.735222 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:59.735185 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp"]
Apr 21 10:16:59.738890 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:59.738861 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp"
Apr 21 10:16:59.747230 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:59.747199 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp"]
Apr 21 10:16:59.836450 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:59.836408 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41ae8c20-5407-46c8-8288-56468432d171-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp\" (UID: \"41ae8c20-5407-46c8-8288-56468432d171\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp"
Apr 21 10:16:59.937099 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:59.937016 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41ae8c20-5407-46c8-8288-56468432d171-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp\" (UID: \"41ae8c20-5407-46c8-8288-56468432d171\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp"
Apr 21 10:16:59.937421 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:16:59.937396 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41ae8c20-5407-46c8-8288-56468432d171-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp\" (UID: \"41ae8c20-5407-46c8-8288-56468432d171\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp"
Apr 21 10:17:00.052439 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:00.052397 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp"
Apr 21 10:17:00.175190 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:00.175159 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp"]
Apr 21 10:17:00.178415 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:17:00.178383 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41ae8c20_5407_46c8_8288_56468432d171.slice/crio-0d9a88fedea6c02d80dca5bc104a84a0f6d312c82e9029eb11820ded35c5a6a0 WatchSource:0}: Error finding container 0d9a88fedea6c02d80dca5bc104a84a0f6d312c82e9029eb11820ded35c5a6a0: Status 404 returned error can't find the container with id 0d9a88fedea6c02d80dca5bc104a84a0f6d312c82e9029eb11820ded35c5a6a0
Apr 21 10:17:00.353282 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:00.353220 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 21 10:17:00.353623 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:00.353590 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:17:00.587638 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:00.587603 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" event={"ID":"41ae8c20-5407-46c8-8288-56468432d171","Type":"ContainerStarted","Data":"ae3c353d1ae47268aa7f6ac3dc0b996a724151491fd6beb6f3d4d695587656b3"}
Apr 21 10:17:00.587638 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:00.587639 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" event={"ID":"41ae8c20-5407-46c8-8288-56468432d171","Type":"ContainerStarted","Data":"0d9a88fedea6c02d80dca5bc104a84a0f6d312c82e9029eb11820ded35c5a6a0"}
Apr 21 10:17:04.601435 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:04.601400 2577 generic.go:358] "Generic (PLEG): container finished" podID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerID="4d082764f43995c95578333ee423e8cbf1a2bf59ce256f78482531928d079347" exitCode=0
Apr 21 10:17:04.601904 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:04.601465 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" event={"ID":"a3df4403-3e34-4fcd-9fc8-29c7df8113e1","Type":"ContainerDied","Data":"4d082764f43995c95578333ee423e8cbf1a2bf59ce256f78482531928d079347"}
Apr 21 10:17:04.602816 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:04.602790 2577 generic.go:358] "Generic (PLEG): container finished" podID="41ae8c20-5407-46c8-8288-56468432d171" containerID="ae3c353d1ae47268aa7f6ac3dc0b996a724151491fd6beb6f3d4d695587656b3" exitCode=0
Apr 21 10:17:04.602940 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:04.602865 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" event={"ID":"41ae8c20-5407-46c8-8288-56468432d171","Type":"ContainerDied","Data":"ae3c353d1ae47268aa7f6ac3dc0b996a724151491fd6beb6f3d4d695587656b3"}
Apr 21 10:17:05.607734 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:05.607700 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" event={"ID":"41ae8c20-5407-46c8-8288-56468432d171","Type":"ContainerStarted","Data":"4301ce859e9b67a31094b0d657b4c44254b61cec13c53a638c5c0beefd1ddaf5"}
Apr 21 10:17:05.607734 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:05.607739 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" event={"ID":"41ae8c20-5407-46c8-8288-56468432d171","Type":"ContainerStarted","Data":"0bbfc5ce6815058b745024541aff1fc548495007444a9bc1d78e3b1c5143fe69"}
Apr 21 10:17:05.608232 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:05.608168 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp"
Apr 21 10:17:05.608232 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:05.608203 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp"
Apr 21 10:17:05.609494 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:05.609466 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:5000: connect: connection refused"
Apr 21 10:17:05.610205 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:05.610183 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:17:05.624285 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:05.624236 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podStartSLOduration=6.624223872 podStartE2EDuration="6.624223872s" podCreationTimestamp="2026-04-21 10:16:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:17:05.623686575 +0000 UTC m=+793.962220594" watchObservedRunningTime="2026-04-21 10:17:05.624223872 +0000 UTC m=+793.962757889"
Apr 21 10:17:06.611256 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:06.611217 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:5000: connect: connection refused"
Apr 21 10:17:06.611706 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:06.611478 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:17:10.353557 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:10.353505 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 21 10:17:10.354023 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:10.353796 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:17:16.611712 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:16.611609 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:5000: connect: connection refused"
Apr 21 10:17:16.612160 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:16.612132 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:17:20.352973 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:20.352919 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.26:8080: connect: connection refused"
Apr 21 10:17:20.353400 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:20.353082 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"
Apr 21 10:17:20.353400 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:20.353315 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:17:20.353511 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:20.353416 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"
Apr 21 10:17:26.611388 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:26.611337 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:5000: connect: connection refused"
Apr 21 10:17:26.611817 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:26.611784 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:17:29.679725 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:29.679684 2577 generic.go:358] "Generic (PLEG): container finished" podID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerID="444afd33e226919b4c5cb2f4bc735dd27931c26e0f9897fcc1c078e0b8d81bc3" exitCode=0
Apr 21 10:17:29.680119 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:29.679772 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" event={"ID":"a3df4403-3e34-4fcd-9fc8-29c7df8113e1","Type":"ContainerDied","Data":"444afd33e226919b4c5cb2f4bc735dd27931c26e0f9897fcc1c078e0b8d81bc3"}
Apr 21 10:17:29.776932 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:29.776905 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"
Apr 21 10:17:29.858564 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:29.858525 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3df4403-3e34-4fcd-9fc8-29c7df8113e1-kserve-provision-location\") pod \"a3df4403-3e34-4fcd-9fc8-29c7df8113e1\" (UID: \"a3df4403-3e34-4fcd-9fc8-29c7df8113e1\") "
Apr 21 10:17:29.858959 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:29.858927 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3df4403-3e34-4fcd-9fc8-29c7df8113e1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a3df4403-3e34-4fcd-9fc8-29c7df8113e1" (UID: "a3df4403-3e34-4fcd-9fc8-29c7df8113e1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:17:29.959466 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:29.959363 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3df4403-3e34-4fcd-9fc8-29c7df8113e1-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:17:30.684614 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:30.684527 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss" event={"ID":"a3df4403-3e34-4fcd-9fc8-29c7df8113e1","Type":"ContainerDied","Data":"a8110c40ce8737505202481f4fe28ad06b707c6b1ce9fa8ffe86ee9fcbf3831d"}
Apr 21 10:17:30.684614 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:30.684566 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"
Apr 21 10:17:30.684614 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:30.684577 2577 scope.go:117] "RemoveContainer" containerID="444afd33e226919b4c5cb2f4bc735dd27931c26e0f9897fcc1c078e0b8d81bc3"
Apr 21 10:17:30.692792 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:30.692772 2577 scope.go:117] "RemoveContainer" containerID="4d082764f43995c95578333ee423e8cbf1a2bf59ce256f78482531928d079347"
Apr 21 10:17:30.700325 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:30.700284 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"]
Apr 21 10:17:30.702205 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:30.702174 2577 scope.go:117] "RemoveContainer" containerID="9706d70994ccc3db1a5f1659c52930789b91719be111fb6f65c8b3c143ae4a49"
Apr 21 10:17:30.703931 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:30.703910 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-78bfdcb64-p9kss"]
Apr 21 10:17:32.231519 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:32.231484 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" path="/var/lib/kubelet/pods/a3df4403-3e34-4fcd-9fc8-29c7df8113e1/volumes"
Apr 21 10:17:36.611284 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:36.611227 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:5000: connect: connection refused"
Apr 21 10:17:36.611728 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:36.611701 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp"
podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:17:46.612049 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:46.611995 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:5000: connect: connection refused" Apr 21 10:17:46.612549 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:46.612515 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:17:56.611971 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:56.611916 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:5000: connect: connection refused" Apr 21 10:17:56.612367 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:17:56.612351 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:18:06.611251 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:06.611201 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:5000: connect: connection 
refused" Apr 21 10:18:06.611710 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:06.611656 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:18:16.611869 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:16.611834 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" Apr 21 10:18:16.612286 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:16.611900 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" Apr 21 10:18:24.818885 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:24.818849 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp"] Apr 21 10:18:24.819289 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:24.819241 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="kserve-container" containerID="cri-o://0bbfc5ce6815058b745024541aff1fc548495007444a9bc1d78e3b1c5143fe69" gracePeriod=30 Apr 21 10:18:24.819369 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:24.819320 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="agent" containerID="cri-o://4301ce859e9b67a31094b0d657b4c44254b61cec13c53a638c5c0beefd1ddaf5" gracePeriod=30 Apr 21 10:18:26.611770 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:26.611699 2577 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:5000: connect: connection refused" Apr 21 10:18:26.612214 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:26.612102 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:18:29.863154 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:29.863121 2577 generic.go:358] "Generic (PLEG): container finished" podID="41ae8c20-5407-46c8-8288-56468432d171" containerID="0bbfc5ce6815058b745024541aff1fc548495007444a9bc1d78e3b1c5143fe69" exitCode=0 Apr 21 10:18:29.863546 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:29.863200 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" event={"ID":"41ae8c20-5407-46c8-8288-56468432d171","Type":"ContainerDied","Data":"0bbfc5ce6815058b745024541aff1fc548495007444a9bc1d78e3b1c5143fe69"} Apr 21 10:18:34.903216 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:34.903179 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7"] Apr 21 10:18:34.903620 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:34.903486 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="agent" Apr 21 10:18:34.903620 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:34.903496 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="agent" Apr 21 10:18:34.903620 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:34.903509 2577 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="kserve-container" Apr 21 10:18:34.903620 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:34.903515 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="kserve-container" Apr 21 10:18:34.903620 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:34.903522 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="storage-initializer" Apr 21 10:18:34.903620 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:34.903528 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="storage-initializer" Apr 21 10:18:34.903620 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:34.903574 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="kserve-container" Apr 21 10:18:34.903620 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:34.903582 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3df4403-3e34-4fcd-9fc8-29c7df8113e1" containerName="agent" Apr 21 10:18:34.906698 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:34.906680 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" Apr 21 10:18:34.915636 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:34.915611 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7"] Apr 21 10:18:34.979589 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:34.979559 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f3f8b46-0b2e-46bf-bb52-782a17ea1662-kserve-provision-location\") pod \"isvc-logger-predictor-59b65cb4c-xz8j7\" (UID: \"1f3f8b46-0b2e-46bf-bb52-782a17ea1662\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" Apr 21 10:18:35.080657 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:35.080611 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f3f8b46-0b2e-46bf-bb52-782a17ea1662-kserve-provision-location\") pod \"isvc-logger-predictor-59b65cb4c-xz8j7\" (UID: \"1f3f8b46-0b2e-46bf-bb52-782a17ea1662\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" Apr 21 10:18:35.081046 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:35.081022 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f3f8b46-0b2e-46bf-bb52-782a17ea1662-kserve-provision-location\") pod \"isvc-logger-predictor-59b65cb4c-xz8j7\" (UID: \"1f3f8b46-0b2e-46bf-bb52-782a17ea1662\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" Apr 21 10:18:35.238244 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:35.238212 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" Apr 21 10:18:35.362639 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:35.362605 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7"] Apr 21 10:18:35.365403 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:18:35.365371 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f3f8b46_0b2e_46bf_bb52_782a17ea1662.slice/crio-16ec90b8daeaa5790488c38f3463b81712f791fea7c4b2feb2c0dd7a179817b1 WatchSource:0}: Error finding container 16ec90b8daeaa5790488c38f3463b81712f791fea7c4b2feb2c0dd7a179817b1: Status 404 returned error can't find the container with id 16ec90b8daeaa5790488c38f3463b81712f791fea7c4b2feb2c0dd7a179817b1 Apr 21 10:18:35.888153 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:35.888119 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" event={"ID":"1f3f8b46-0b2e-46bf-bb52-782a17ea1662","Type":"ContainerStarted","Data":"10da4aa23282b465bc2019ecd94d93b6937c2f9478f0e70d13f4f7b69215ac74"} Apr 21 10:18:35.888153 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:35.888154 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" event={"ID":"1f3f8b46-0b2e-46bf-bb52-782a17ea1662","Type":"ContainerStarted","Data":"16ec90b8daeaa5790488c38f3463b81712f791fea7c4b2feb2c0dd7a179817b1"} Apr 21 10:18:36.611441 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:36.611387 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:5000: connect: connection refused" Apr 21 10:18:36.611865 ip-10-0-133-157 kubenswrapper[2577]: 
I0421 10:18:36.611655 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:18:39.899844 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:39.899807 2577 generic.go:358] "Generic (PLEG): container finished" podID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerID="10da4aa23282b465bc2019ecd94d93b6937c2f9478f0e70d13f4f7b69215ac74" exitCode=0 Apr 21 10:18:39.900258 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:39.899881 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" event={"ID":"1f3f8b46-0b2e-46bf-bb52-782a17ea1662","Type":"ContainerDied","Data":"10da4aa23282b465bc2019ecd94d93b6937c2f9478f0e70d13f4f7b69215ac74"} Apr 21 10:18:40.905453 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:40.905370 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" event={"ID":"1f3f8b46-0b2e-46bf-bb52-782a17ea1662","Type":"ContainerStarted","Data":"a74f0900944d342d970b77563ab50d63a8905007a096e00339e655b9d04cd5b3"} Apr 21 10:18:40.905453 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:40.905412 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" event={"ID":"1f3f8b46-0b2e-46bf-bb52-782a17ea1662","Type":"ContainerStarted","Data":"bd8e95e1da7694270ec21dcb8483356b9837093d39dfb5fd11bc10649a48630e"} Apr 21 10:18:40.905899 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:40.905697 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" Apr 21 10:18:40.907097 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:40.907068 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 21 10:18:40.923502 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:40.923456 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podStartSLOduration=6.923441568 podStartE2EDuration="6.923441568s" podCreationTimestamp="2026-04-21 10:18:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:18:40.921127886 +0000 UTC m=+889.259661904" watchObservedRunningTime="2026-04-21 10:18:40.923441568 +0000 UTC m=+889.261975585" Apr 21 10:18:41.907945 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:41.907912 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" Apr 21 10:18:41.908431 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:41.908078 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 21 10:18:41.909019 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:41.908997 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:18:42.911418 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:42.911380 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" 
podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 21 10:18:42.911920 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:42.911873 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:18:46.611942 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:46.611899 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.27:5000: connect: connection refused" Apr 21 10:18:46.612382 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:46.612039 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" Apr 21 10:18:46.612382 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:46.612245 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:18:46.612382 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:46.612369 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" Apr 21 10:18:52.158061 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:52.158034 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 
10:18:52.158684 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:52.158666 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 10:18:52.912198 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:52.912146 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 21 10:18:52.912686 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:52.912658 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:18:54.948573 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:54.948536 2577 generic.go:358] "Generic (PLEG): container finished" podID="41ae8c20-5407-46c8-8288-56468432d171" containerID="4301ce859e9b67a31094b0d657b4c44254b61cec13c53a638c5c0beefd1ddaf5" exitCode=0 Apr 21 10:18:54.948929 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:54.948581 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" event={"ID":"41ae8c20-5407-46c8-8288-56468432d171","Type":"ContainerDied","Data":"4301ce859e9b67a31094b0d657b4c44254b61cec13c53a638c5c0beefd1ddaf5"} Apr 21 10:18:54.970798 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:54.970769 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" Apr 21 10:18:55.052036 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:55.051985 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41ae8c20-5407-46c8-8288-56468432d171-kserve-provision-location\") pod \"41ae8c20-5407-46c8-8288-56468432d171\" (UID: \"41ae8c20-5407-46c8-8288-56468432d171\") " Apr 21 10:18:55.052377 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:55.052350 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41ae8c20-5407-46c8-8288-56468432d171-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "41ae8c20-5407-46c8-8288-56468432d171" (UID: "41ae8c20-5407-46c8-8288-56468432d171"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:18:55.152727 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:55.152649 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41ae8c20-5407-46c8-8288-56468432d171-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:18:55.953464 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:55.953427 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" event={"ID":"41ae8c20-5407-46c8-8288-56468432d171","Type":"ContainerDied","Data":"0d9a88fedea6c02d80dca5bc104a84a0f6d312c82e9029eb11820ded35c5a6a0"} Apr 21 10:18:55.953464 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:55.953450 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp" Apr 21 10:18:55.954011 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:55.953482 2577 scope.go:117] "RemoveContainer" containerID="4301ce859e9b67a31094b0d657b4c44254b61cec13c53a638c5c0beefd1ddaf5" Apr 21 10:18:55.961888 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:55.961858 2577 scope.go:117] "RemoveContainer" containerID="0bbfc5ce6815058b745024541aff1fc548495007444a9bc1d78e3b1c5143fe69" Apr 21 10:18:55.969199 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:55.969180 2577 scope.go:117] "RemoveContainer" containerID="ae3c353d1ae47268aa7f6ac3dc0b996a724151491fd6beb6f3d4d695587656b3" Apr 21 10:18:55.973724 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:55.973696 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp"] Apr 21 10:18:55.977345 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:55.977320 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-5cdfb57445-txjcp"] Apr 21 10:18:56.231551 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:18:56.231516 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41ae8c20-5407-46c8-8288-56468432d171" path="/var/lib/kubelet/pods/41ae8c20-5407-46c8-8288-56468432d171/volumes" Apr 21 10:19:02.911412 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:19:02.911366 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 21 10:19:02.913903 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:19:02.911798 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" 
podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:19:12.911542 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:19:12.911492 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 21 10:19:12.912009 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:19:12.911898 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:19:22.912052 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:19:22.912005 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 21 10:19:22.912482 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:19:22.912435 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 21 10:19:32.911393 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:19:32.911344 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused" Apr 21 10:19:32.911873 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:19:32.911786 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:19:42.912119 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:19:42.912071 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 21 10:19:42.912664 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:19:42.912508 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:19:48.236393 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:19:48.233956 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7"
Apr 21 10:19:48.236393 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:19:48.234110 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7"
Apr 21 10:20:00.102218 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.102182 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7"]
Apr 21 10:20:00.102831 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.102477 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container" containerID="cri-o://bd8e95e1da7694270ec21dcb8483356b9837093d39dfb5fd11bc10649a48630e" gracePeriod=30
Apr 21 10:20:00.102831 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.102589 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="agent" containerID="cri-o://a74f0900944d342d970b77563ab50d63a8905007a096e00339e655b9d04cd5b3" gracePeriod=30
Apr 21 10:20:00.122447 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.122414 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc"]
Apr 21 10:20:00.122736 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.122724 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="kserve-container"
Apr 21 10:20:00.122816 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.122738 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="kserve-container"
Apr 21 10:20:00.122816 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.122764 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="agent"
Apr 21 10:20:00.122816 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.122773 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="agent"
Apr 21 10:20:00.122816 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.122789 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="storage-initializer"
Apr 21 10:20:00.122816 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.122795 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="storage-initializer"
Apr 21 10:20:00.122985 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.122842 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="agent"
Apr 21 10:20:00.122985 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.122855 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="41ae8c20-5407-46c8-8288-56468432d171" containerName="kserve-container"
Apr 21 10:20:00.126033 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.126015 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc"
Apr 21 10:20:00.133888 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.133862 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc"]
Apr 21 10:20:00.290457 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.290422 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af166313-95e4-4282-9d51-2e96a46314d0-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-gj6lc\" (UID: \"af166313-95e4-4282-9d51-2e96a46314d0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc"
Apr 21 10:20:00.391891 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.391788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af166313-95e4-4282-9d51-2e96a46314d0-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-gj6lc\" (UID: \"af166313-95e4-4282-9d51-2e96a46314d0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc"
Apr 21 10:20:00.392183 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.392159 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af166313-95e4-4282-9d51-2e96a46314d0-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-gj6lc\" (UID: \"af166313-95e4-4282-9d51-2e96a46314d0\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc"
Apr 21 10:20:00.436879 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.436848 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc"
Apr 21 10:20:00.561421 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.561394 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc"]
Apr 21 10:20:00.564220 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:20:00.564187 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf166313_95e4_4282_9d51_2e96a46314d0.slice/crio-8715a93c28b01c4c14f0f67a8e73f5e4e406dfdf6a62dc23d606e664b9a3eef1 WatchSource:0}: Error finding container 8715a93c28b01c4c14f0f67a8e73f5e4e406dfdf6a62dc23d606e664b9a3eef1: Status 404 returned error can't find the container with id 8715a93c28b01c4c14f0f67a8e73f5e4e406dfdf6a62dc23d606e664b9a3eef1
Apr 21 10:20:00.565909 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:00.565892 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 10:20:01.151429 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:01.151383 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" event={"ID":"af166313-95e4-4282-9d51-2e96a46314d0","Type":"ContainerStarted","Data":"fa50f142788d2987982bdab73086c61d01a808f0d215fb49e0071afc7067e8a4"}
Apr 21 10:20:01.151429 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:01.151432 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" event={"ID":"af166313-95e4-4282-9d51-2e96a46314d0","Type":"ContainerStarted","Data":"8715a93c28b01c4c14f0f67a8e73f5e4e406dfdf6a62dc23d606e664b9a3eef1"}
Apr 21 10:20:05.164109 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:05.164067 2577 generic.go:358] "Generic (PLEG): container finished" podID="af166313-95e4-4282-9d51-2e96a46314d0" containerID="fa50f142788d2987982bdab73086c61d01a808f0d215fb49e0071afc7067e8a4" exitCode=0
Apr 21 10:20:05.164568 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:05.164137 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" event={"ID":"af166313-95e4-4282-9d51-2e96a46314d0","Type":"ContainerDied","Data":"fa50f142788d2987982bdab73086c61d01a808f0d215fb49e0071afc7067e8a4"}
Apr 21 10:20:05.166327 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:05.166300 2577 generic.go:358] "Generic (PLEG): container finished" podID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerID="bd8e95e1da7694270ec21dcb8483356b9837093d39dfb5fd11bc10649a48630e" exitCode=0
Apr 21 10:20:05.166406 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:05.166359 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" event={"ID":"1f3f8b46-0b2e-46bf-bb52-782a17ea1662","Type":"ContainerDied","Data":"bd8e95e1da7694270ec21dcb8483356b9837093d39dfb5fd11bc10649a48630e"}
Apr 21 10:20:08.228298 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:08.228250 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 21 10:20:08.228820 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:08.228791 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:20:12.190768 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:12.190666 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" event={"ID":"af166313-95e4-4282-9d51-2e96a46314d0","Type":"ContainerStarted","Data":"1d8a9701ed03370091555412e07209f214b92fe92a7d8f6e19f00b30a54b29d4"}
Apr 21 10:20:12.191109 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:12.190973 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc"
Apr 21 10:20:12.192353 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:12.192326 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 21 10:20:12.206181 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:12.206133 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" podStartSLOduration=5.546562047 podStartE2EDuration="12.206120197s" podCreationTimestamp="2026-04-21 10:20:00 +0000 UTC" firstStartedPulling="2026-04-21 10:20:05.165541171 +0000 UTC m=+973.504075166" lastFinishedPulling="2026-04-21 10:20:11.825099321 +0000 UTC m=+980.163633316" observedRunningTime="2026-04-21 10:20:12.204553959 +0000 UTC m=+980.543087975" watchObservedRunningTime="2026-04-21 10:20:12.206120197 +0000 UTC m=+980.544654213"
Apr 21 10:20:13.194307 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:13.194263 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 21 10:20:18.228372 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:18.228320 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 21 10:20:18.228875 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:18.228655 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:20:23.195030 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:23.194974 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 21 10:20:28.228929 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:28.228872 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 21 10:20:28.229378 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:28.229203 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 21 10:20:28.231792 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:28.231772 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7"
Apr 21 10:20:28.231922 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:28.231819 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7"
Apr 21 10:20:30.244497 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:30.244461 2577 generic.go:358] "Generic (PLEG): container finished" podID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerID="a74f0900944d342d970b77563ab50d63a8905007a096e00339e655b9d04cd5b3" exitCode=0
Apr 21 10:20:30.244876 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:30.244525 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" event={"ID":"1f3f8b46-0b2e-46bf-bb52-782a17ea1662","Type":"ContainerDied","Data":"a74f0900944d342d970b77563ab50d63a8905007a096e00339e655b9d04cd5b3"}
Apr 21 10:20:30.745281 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:30.745255 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7"
Apr 21 10:20:30.838439 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:30.838406 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f3f8b46-0b2e-46bf-bb52-782a17ea1662-kserve-provision-location\") pod \"1f3f8b46-0b2e-46bf-bb52-782a17ea1662\" (UID: \"1f3f8b46-0b2e-46bf-bb52-782a17ea1662\") "
Apr 21 10:20:30.838737 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:30.838712 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f3f8b46-0b2e-46bf-bb52-782a17ea1662-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1f3f8b46-0b2e-46bf-bb52-782a17ea1662" (UID: "1f3f8b46-0b2e-46bf-bb52-782a17ea1662"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:20:30.939982 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:30.939889 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f3f8b46-0b2e-46bf-bb52-782a17ea1662-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:20:31.249688 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:31.249653 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7" event={"ID":"1f3f8b46-0b2e-46bf-bb52-782a17ea1662","Type":"ContainerDied","Data":"16ec90b8daeaa5790488c38f3463b81712f791fea7c4b2feb2c0dd7a179817b1"}
Apr 21 10:20:31.249688 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:31.249669 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7"
Apr 21 10:20:31.249688 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:31.249697 2577 scope.go:117] "RemoveContainer" containerID="a74f0900944d342d970b77563ab50d63a8905007a096e00339e655b9d04cd5b3"
Apr 21 10:20:31.257930 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:31.257907 2577 scope.go:117] "RemoveContainer" containerID="bd8e95e1da7694270ec21dcb8483356b9837093d39dfb5fd11bc10649a48630e"
Apr 21 10:20:31.265760 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:31.265727 2577 scope.go:117] "RemoveContainer" containerID="10da4aa23282b465bc2019ecd94d93b6937c2f9478f0e70d13f4f7b69215ac74"
Apr 21 10:20:31.270307 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:31.270276 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7"]
Apr 21 10:20:31.273958 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:31.273928 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-59b65cb4c-xz8j7"]
Apr 21 10:20:32.231563 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:32.231533 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" path="/var/lib/kubelet/pods/1f3f8b46-0b2e-46bf-bb52-782a17ea1662/volumes"
Apr 21 10:20:33.194890 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:33.194844 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 21 10:20:43.194423 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:43.194373 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 21 10:20:53.194916 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:20:53.194867 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 21 10:21:03.194674 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:03.194622 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 21 10:21:13.194637 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:13.194589 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 21 10:21:23.195086 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:23.195044 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 21 10:21:26.228360 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:26.228317 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 21 10:21:36.234241 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:36.234212 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc"
Apr 21 10:21:40.254769 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.254722 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc"]
Apr 21 10:21:40.255187 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.254969 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="kserve-container" containerID="cri-o://1d8a9701ed03370091555412e07209f214b92fe92a7d8f6e19f00b30a54b29d4" gracePeriod=30
Apr 21 10:21:40.338087 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.338048 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8"]
Apr 21 10:21:40.338368 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.338355 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container"
Apr 21 10:21:40.338417 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.338369 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container"
Apr 21 10:21:40.338417 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.338384 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="agent"
Apr 21 10:21:40.338417 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.338390 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="agent"
Apr 21 10:21:40.338417 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.338397 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="storage-initializer"
Apr 21 10:21:40.338417 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.338403 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="storage-initializer"
Apr 21 10:21:40.338562 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.338453 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="agent"
Apr 21 10:21:40.338562 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.338461 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f3f8b46-0b2e-46bf-bb52-782a17ea1662" containerName="kserve-container"
Apr 21 10:21:40.341329 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.341314 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8"
Apr 21 10:21:40.348454 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.348377 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8"]
Apr 21 10:21:40.403554 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.403509 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24cd6a2f-5472-4b84-a482-197432f7c172-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-spdh8\" (UID: \"24cd6a2f-5472-4b84-a482-197432f7c172\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8"
Apr 21 10:21:40.503996 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.503956 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24cd6a2f-5472-4b84-a482-197432f7c172-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-spdh8\" (UID: \"24cd6a2f-5472-4b84-a482-197432f7c172\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8"
Apr 21 10:21:40.504337 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.504318 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24cd6a2f-5472-4b84-a482-197432f7c172-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-spdh8\" (UID: \"24cd6a2f-5472-4b84-a482-197432f7c172\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8"
Apr 21 10:21:40.652261 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.652175 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8"
Apr 21 10:21:40.774766 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:40.774626 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8"]
Apr 21 10:21:40.777335 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:21:40.777305 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24cd6a2f_5472_4b84_a482_197432f7c172.slice/crio-4fe7f388cd62eec567aad001908a1359f7b5cd6b4561332fbf9af272eb100bfe WatchSource:0}: Error finding container 4fe7f388cd62eec567aad001908a1359f7b5cd6b4561332fbf9af272eb100bfe: Status 404 returned error can't find the container with id 4fe7f388cd62eec567aad001908a1359f7b5cd6b4561332fbf9af272eb100bfe
Apr 21 10:21:41.447862 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:41.447824 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" event={"ID":"24cd6a2f-5472-4b84-a482-197432f7c172","Type":"ContainerStarted","Data":"35997e658d658059f25212789222b2bfd0e88870930d3fa731acd1faa73ea1ba"}
Apr 21 10:21:41.447862 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:41.447863 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" event={"ID":"24cd6a2f-5472-4b84-a482-197432f7c172","Type":"ContainerStarted","Data":"4fe7f388cd62eec567aad001908a1359f7b5cd6b4561332fbf9af272eb100bfe"}
Apr 21 10:21:44.458927 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:44.458890 2577 generic.go:358] "Generic (PLEG): container finished" podID="24cd6a2f-5472-4b84-a482-197432f7c172" containerID="35997e658d658059f25212789222b2bfd0e88870930d3fa731acd1faa73ea1ba" exitCode=0
Apr 21 10:21:44.459308 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:44.458969 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" event={"ID":"24cd6a2f-5472-4b84-a482-197432f7c172","Type":"ContainerDied","Data":"35997e658d658059f25212789222b2bfd0e88870930d3fa731acd1faa73ea1ba"}
Apr 21 10:21:45.464338 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:45.464300 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" event={"ID":"24cd6a2f-5472-4b84-a482-197432f7c172","Type":"ContainerStarted","Data":"ff7cac095ff351115c09c25033e4aab62eddc1b2c5da0c78f6f87b9344cd191b"}
Apr 21 10:21:45.464783 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:45.464607 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8"
Apr 21 10:21:45.466110 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:45.466081 2577 generic.go:358] "Generic (PLEG): container finished" podID="af166313-95e4-4282-9d51-2e96a46314d0" containerID="1d8a9701ed03370091555412e07209f214b92fe92a7d8f6e19f00b30a54b29d4" exitCode=0
Apr 21 10:21:45.466231 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:45.466133 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" event={"ID":"af166313-95e4-4282-9d51-2e96a46314d0","Type":"ContainerDied","Data":"1d8a9701ed03370091555412e07209f214b92fe92a7d8f6e19f00b30a54b29d4"}
Apr 21 10:21:45.466231 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:45.466204 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 21 10:21:45.486365 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:45.484920 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" podStartSLOduration=5.48490097 podStartE2EDuration="5.48490097s" podCreationTimestamp="2026-04-21 10:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:21:45.481133667 +0000 UTC m=+1073.819667686" watchObservedRunningTime="2026-04-21 10:21:45.48490097 +0000 UTC m=+1073.823434989"
Apr 21 10:21:45.498999 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:45.498970 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc"
Apr 21 10:21:45.544989 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:45.544950 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af166313-95e4-4282-9d51-2e96a46314d0-kserve-provision-location\") pod \"af166313-95e4-4282-9d51-2e96a46314d0\" (UID: \"af166313-95e4-4282-9d51-2e96a46314d0\") "
Apr 21 10:21:45.545325 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:45.545301 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af166313-95e4-4282-9d51-2e96a46314d0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "af166313-95e4-4282-9d51-2e96a46314d0" (UID: "af166313-95e4-4282-9d51-2e96a46314d0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:21:45.646026 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:45.645931 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/af166313-95e4-4282-9d51-2e96a46314d0-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:21:46.470473 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:46.470447 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc"
Apr 21 10:21:46.470937 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:46.470444 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc" event={"ID":"af166313-95e4-4282-9d51-2e96a46314d0","Type":"ContainerDied","Data":"8715a93c28b01c4c14f0f67a8e73f5e4e406dfdf6a62dc23d606e664b9a3eef1"}
Apr 21 10:21:46.470937 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:46.470554 2577 scope.go:117] "RemoveContainer" containerID="1d8a9701ed03370091555412e07209f214b92fe92a7d8f6e19f00b30a54b29d4"
Apr 21 10:21:46.471056 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:46.470995 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 21 10:21:46.478408 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:46.478390 2577 scope.go:117] "RemoveContainer" containerID="fa50f142788d2987982bdab73086c61d01a808f0d215fb49e0071afc7067e8a4"
Apr 21 10:21:46.485885 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:46.485836 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc"]
Apr 21 10:21:46.489816 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:46.489757 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-gj6lc"]
Apr 21 10:21:48.231710 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:48.231677 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af166313-95e4-4282-9d51-2e96a46314d0" path="/var/lib/kubelet/pods/af166313-95e4-4282-9d51-2e96a46314d0/volumes"
Apr 21 10:21:56.471393 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:21:56.471345 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 21 10:22:06.471934 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:22:06.471888 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 21 10:22:16.471728 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:22:16.471683 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 21 10:22:26.471108 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:22:26.471056 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 21 10:22:36.470907 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:22:36.470862 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 21 10:22:46.471473 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:22:46.471427 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 21 10:22:49.228322 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:22:49.228280 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 21 10:22:59.228481 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:22:59.228437 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.30:8080: connect: connection refused"
Apr 21 10:23:09.230032 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:09.230001 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8"
Apr 21 10:23:10.844285 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:10.844201 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8"]
Apr 21 10:23:10.844736 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:10.844464 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="kserve-container" containerID="cri-o://ff7cac095ff351115c09c25033e4aab62eddc1b2c5da0c78f6f87b9344cd191b" gracePeriod=30
Apr 21 10:23:10.929636 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:10.929599 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7"]
Apr 21 10:23:10.929922 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:10.929909 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="kserve-container"
Apr 21 10:23:10.929922 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:10.929924 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="kserve-container"
Apr 21 10:23:10.930019 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:10.929934 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="storage-initializer"
Apr 21 10:23:10.930019 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:10.929940 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="storage-initializer"
Apr 21 10:23:10.930019 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:10.929993 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="af166313-95e4-4282-9d51-2e96a46314d0" containerName="kserve-container"
Apr 21 10:23:10.933042 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:10.933025 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7"
Apr 21 10:23:10.942970 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:10.942941 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7"]
Apr 21 10:23:11.000156 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:11.000122 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7\" (UID: \"7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7"
Apr 21 10:23:11.100863 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:11.100726 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7\" (UID: \"7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7"
Apr 21 10:23:11.101129 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:11.101107 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7\" (UID: \"7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7"
Apr 21 10:23:11.242874 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:11.242835 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7" Apr 21 10:23:11.365501 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:11.365331 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7"] Apr 21 10:23:11.368240 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:23:11.368207 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7851ad51_f5bd_4cec_a8aa_bb04b32e3bfc.slice/crio-1950b1e099d5c1c0eb42004b2fa87d34246bae944b8d294725a82ec170b2ce70 WatchSource:0}: Error finding container 1950b1e099d5c1c0eb42004b2fa87d34246bae944b8d294725a82ec170b2ce70: Status 404 returned error can't find the container with id 1950b1e099d5c1c0eb42004b2fa87d34246bae944b8d294725a82ec170b2ce70 Apr 21 10:23:11.732731 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:11.732698 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7" event={"ID":"7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc","Type":"ContainerStarted","Data":"13e23b5c07d7b18131eb0f10c58c78b3d9f56f4db9e4f3278b16398ad48961a5"} Apr 21 10:23:11.732731 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:11.732735 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7" event={"ID":"7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc","Type":"ContainerStarted","Data":"1950b1e099d5c1c0eb42004b2fa87d34246bae944b8d294725a82ec170b2ce70"} Apr 21 10:23:15.746697 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:15.746661 2577 generic.go:358] "Generic (PLEG): container finished" podID="7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc" containerID="13e23b5c07d7b18131eb0f10c58c78b3d9f56f4db9e4f3278b16398ad48961a5" exitCode=0 Apr 21 10:23:15.747194 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:15.746740 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7" event={"ID":"7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc","Type":"ContainerDied","Data":"13e23b5c07d7b18131eb0f10c58c78b3d9f56f4db9e4f3278b16398ad48961a5"} Apr 21 10:23:15.748679 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:15.748494 2577 generic.go:358] "Generic (PLEG): container finished" podID="24cd6a2f-5472-4b84-a482-197432f7c172" containerID="ff7cac095ff351115c09c25033e4aab62eddc1b2c5da0c78f6f87b9344cd191b" exitCode=0 Apr 21 10:23:15.748679 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:15.748526 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" event={"ID":"24cd6a2f-5472-4b84-a482-197432f7c172","Type":"ContainerDied","Data":"ff7cac095ff351115c09c25033e4aab62eddc1b2c5da0c78f6f87b9344cd191b"} Apr 21 10:23:15.779438 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:15.779414 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" Apr 21 10:23:15.839866 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:15.839830 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24cd6a2f-5472-4b84-a482-197432f7c172-kserve-provision-location\") pod \"24cd6a2f-5472-4b84-a482-197432f7c172\" (UID: \"24cd6a2f-5472-4b84-a482-197432f7c172\") " Apr 21 10:23:15.840201 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:15.840176 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24cd6a2f-5472-4b84-a482-197432f7c172-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "24cd6a2f-5472-4b84-a482-197432f7c172" (UID: "24cd6a2f-5472-4b84-a482-197432f7c172"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:23:15.941234 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:15.941148 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24cd6a2f-5472-4b84-a482-197432f7c172-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:23:16.756360 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:16.756317 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" event={"ID":"24cd6a2f-5472-4b84-a482-197432f7c172","Type":"ContainerDied","Data":"4fe7f388cd62eec567aad001908a1359f7b5cd6b4561332fbf9af272eb100bfe"} Apr 21 10:23:16.756975 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:16.756373 2577 scope.go:117] "RemoveContainer" containerID="ff7cac095ff351115c09c25033e4aab62eddc1b2c5da0c78f6f87b9344cd191b" Apr 21 10:23:16.756975 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:16.756549 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8" Apr 21 10:23:16.772145 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:16.772119 2577 scope.go:117] "RemoveContainer" containerID="35997e658d658059f25212789222b2bfd0e88870930d3fa731acd1faa73ea1ba" Apr 21 10:23:16.778024 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:16.777993 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8"] Apr 21 10:23:16.780091 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:16.780057 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-spdh8"] Apr 21 10:23:18.235375 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:18.234958 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" path="/var/lib/kubelet/pods/24cd6a2f-5472-4b84-a482-197432f7c172/volumes" Apr 21 10:23:52.177832 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:52.177805 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 10:23:52.180074 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:23:52.180051 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 10:25:31.187486 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:25:31.187450 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7" event={"ID":"7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc","Type":"ContainerStarted","Data":"adf965100dcef1eb9efd6cc624a0daca82ae8ef517f2223d79b0b89455bf700e"} Apr 21 10:25:31.188034 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:25:31.187599 2577 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7" Apr 21 10:25:31.213729 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:25:31.213639 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7" podStartSLOduration=6.499625137 podStartE2EDuration="2m21.213623634s" podCreationTimestamp="2026-04-21 10:23:10 +0000 UTC" firstStartedPulling="2026-04-21 10:23:15.748014934 +0000 UTC m=+1164.086548932" lastFinishedPulling="2026-04-21 10:25:30.462013417 +0000 UTC m=+1298.800547429" observedRunningTime="2026-04-21 10:25:31.213235995 +0000 UTC m=+1299.551770012" watchObservedRunningTime="2026-04-21 10:25:31.213623634 +0000 UTC m=+1299.552157652" Apr 21 10:26:02.195879 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:02.195847 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7" Apr 21 10:26:11.114556 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.114469 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7"] Apr 21 10:26:11.114951 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.114768 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7" podUID="7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc" containerName="kserve-container" containerID="cri-o://adf965100dcef1eb9efd6cc624a0daca82ae8ef517f2223d79b0b89455bf700e" gracePeriod=30 Apr 21 10:26:11.200633 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.200589 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4"] Apr 21 10:26:11.200978 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.200962 2577 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="kserve-container" Apr 21 10:26:11.200978 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.200978 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="kserve-container" Apr 21 10:26:11.201097 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.200999 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="storage-initializer" Apr 21 10:26:11.201097 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.201005 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="storage-initializer" Apr 21 10:26:11.201097 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.201060 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="24cd6a2f-5472-4b84-a482-197432f7c172" containerName="kserve-container" Apr 21 10:26:11.204278 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.204254 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" Apr 21 10:26:11.211008 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.210980 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4"] Apr 21 10:26:11.347855 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.347817 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f65885b0-ad1a-45e5-8f51-a8dbcce55842-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4\" (UID: \"f65885b0-ad1a-45e5-8f51-a8dbcce55842\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" Apr 21 10:26:11.449310 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.449211 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f65885b0-ad1a-45e5-8f51-a8dbcce55842-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4\" (UID: \"f65885b0-ad1a-45e5-8f51-a8dbcce55842\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" Apr 21 10:26:11.449607 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.449586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f65885b0-ad1a-45e5-8f51-a8dbcce55842-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4\" (UID: \"f65885b0-ad1a-45e5-8f51-a8dbcce55842\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" Apr 21 10:26:11.515372 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.515333 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" Apr 21 10:26:11.638900 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.638869 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4"] Apr 21 10:26:11.641523 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:26:11.641491 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf65885b0_ad1a_45e5_8f51_a8dbcce55842.slice/crio-3a6b04f7c2549d464904ae88ba4c170f8a528532ffa1f59566a2c63bb8995f46 WatchSource:0}: Error finding container 3a6b04f7c2549d464904ae88ba4c170f8a528532ffa1f59566a2c63bb8995f46: Status 404 returned error can't find the container with id 3a6b04f7c2549d464904ae88ba4c170f8a528532ffa1f59566a2c63bb8995f46 Apr 21 10:26:11.643384 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:11.643367 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:26:12.254036 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.254012 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7" Apr 21 10:26:12.313819 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.313785 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" event={"ID":"f65885b0-ad1a-45e5-8f51-a8dbcce55842","Type":"ContainerStarted","Data":"986348f806cf7b1588f7d447dc1dcac4676ce02af2ebabc8c8b2c6d59db8a7ad"} Apr 21 10:26:12.313819 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.313827 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" event={"ID":"f65885b0-ad1a-45e5-8f51-a8dbcce55842","Type":"ContainerStarted","Data":"3a6b04f7c2549d464904ae88ba4c170f8a528532ffa1f59566a2c63bb8995f46"} Apr 21 10:26:12.315201 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.315171 2577 generic.go:358] "Generic (PLEG): container finished" podID="7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc" containerID="adf965100dcef1eb9efd6cc624a0daca82ae8ef517f2223d79b0b89455bf700e" exitCode=0 Apr 21 10:26:12.315346 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.315236 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7" Apr 21 10:26:12.315346 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.315256 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7" event={"ID":"7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc","Type":"ContainerDied","Data":"adf965100dcef1eb9efd6cc624a0daca82ae8ef517f2223d79b0b89455bf700e"} Apr 21 10:26:12.315346 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.315297 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7" event={"ID":"7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc","Type":"ContainerDied","Data":"1950b1e099d5c1c0eb42004b2fa87d34246bae944b8d294725a82ec170b2ce70"} Apr 21 10:26:12.315346 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.315317 2577 scope.go:117] "RemoveContainer" containerID="adf965100dcef1eb9efd6cc624a0daca82ae8ef517f2223d79b0b89455bf700e" Apr 21 10:26:12.323193 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.323175 2577 scope.go:117] "RemoveContainer" containerID="13e23b5c07d7b18131eb0f10c58c78b3d9f56f4db9e4f3278b16398ad48961a5" Apr 21 10:26:12.330920 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.330893 2577 scope.go:117] "RemoveContainer" containerID="adf965100dcef1eb9efd6cc624a0daca82ae8ef517f2223d79b0b89455bf700e" Apr 21 10:26:12.331242 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:26:12.331223 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adf965100dcef1eb9efd6cc624a0daca82ae8ef517f2223d79b0b89455bf700e\": container with ID starting with adf965100dcef1eb9efd6cc624a0daca82ae8ef517f2223d79b0b89455bf700e not found: ID does not exist" containerID="adf965100dcef1eb9efd6cc624a0daca82ae8ef517f2223d79b0b89455bf700e" Apr 21 10:26:12.331325 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.331253 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf965100dcef1eb9efd6cc624a0daca82ae8ef517f2223d79b0b89455bf700e"} err="failed to get container status \"adf965100dcef1eb9efd6cc624a0daca82ae8ef517f2223d79b0b89455bf700e\": rpc error: code = NotFound desc = could not find container \"adf965100dcef1eb9efd6cc624a0daca82ae8ef517f2223d79b0b89455bf700e\": container with ID starting with adf965100dcef1eb9efd6cc624a0daca82ae8ef517f2223d79b0b89455bf700e not found: ID does not exist" Apr 21 10:26:12.331325 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.331272 2577 scope.go:117] "RemoveContainer" containerID="13e23b5c07d7b18131eb0f10c58c78b3d9f56f4db9e4f3278b16398ad48961a5" Apr 21 10:26:12.331545 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:26:12.331520 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13e23b5c07d7b18131eb0f10c58c78b3d9f56f4db9e4f3278b16398ad48961a5\": container with ID starting with 13e23b5c07d7b18131eb0f10c58c78b3d9f56f4db9e4f3278b16398ad48961a5 not found: ID does not exist" containerID="13e23b5c07d7b18131eb0f10c58c78b3d9f56f4db9e4f3278b16398ad48961a5" Apr 21 10:26:12.331602 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.331554 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13e23b5c07d7b18131eb0f10c58c78b3d9f56f4db9e4f3278b16398ad48961a5"} err="failed to get container status \"13e23b5c07d7b18131eb0f10c58c78b3d9f56f4db9e4f3278b16398ad48961a5\": rpc error: code = NotFound desc = could not find container \"13e23b5c07d7b18131eb0f10c58c78b3d9f56f4db9e4f3278b16398ad48961a5\": container with ID starting with 13e23b5c07d7b18131eb0f10c58c78b3d9f56f4db9e4f3278b16398ad48961a5 not found: ID does not exist" Apr 21 10:26:12.355693 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.355601 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc-kserve-provision-location\") pod \"7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc\" (UID: \"7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc\") " Apr 21 10:26:12.356002 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.355979 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc" (UID: "7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:26:12.456311 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.456275 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:26:12.637326 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.637296 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7"] Apr 21 10:26:12.639786 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:12.639742 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7"] Apr 21 10:26:13.192264 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:13.192219 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-2m2v7" podUID="7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.31:8080/v2/models/isvc-lightgbm-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 21 10:26:14.232352 ip-10-0-133-157 
kubenswrapper[2577]: I0421 10:26:14.232315 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc" path="/var/lib/kubelet/pods/7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc/volumes" Apr 21 10:26:16.334032 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:16.334000 2577 generic.go:358] "Generic (PLEG): container finished" podID="f65885b0-ad1a-45e5-8f51-a8dbcce55842" containerID="986348f806cf7b1588f7d447dc1dcac4676ce02af2ebabc8c8b2c6d59db8a7ad" exitCode=0 Apr 21 10:26:16.334451 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:16.334078 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" event={"ID":"f65885b0-ad1a-45e5-8f51-a8dbcce55842","Type":"ContainerDied","Data":"986348f806cf7b1588f7d447dc1dcac4676ce02af2ebabc8c8b2c6d59db8a7ad"} Apr 21 10:26:17.339514 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:17.339475 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" event={"ID":"f65885b0-ad1a-45e5-8f51-a8dbcce55842","Type":"ContainerStarted","Data":"dfa4e898bc01c4989656d8f4ef4a7500804a7b0a3fa21574cb90cc2cea5e55f9"} Apr 21 10:26:17.340014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:17.339858 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" Apr 21 10:26:17.341258 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:17.341235 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" podUID="f65885b0-ad1a-45e5-8f51-a8dbcce55842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 21 10:26:17.354683 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:17.354635 2577 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" podStartSLOduration=6.35462122 podStartE2EDuration="6.35462122s" podCreationTimestamp="2026-04-21 10:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:26:17.353659183 +0000 UTC m=+1345.692193197" watchObservedRunningTime="2026-04-21 10:26:17.35462122 +0000 UTC m=+1345.693155269" Apr 21 10:26:18.343307 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:18.343272 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" podUID="f65885b0-ad1a-45e5-8f51-a8dbcce55842" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.32:8080: connect: connection refused" Apr 21 10:26:28.344506 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:28.344476 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" Apr 21 10:26:31.233825 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:31.233793 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4"] Apr 21 10:26:31.234258 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:31.234068 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" podUID="f65885b0-ad1a-45e5-8f51-a8dbcce55842" containerName="kserve-container" containerID="cri-o://dfa4e898bc01c4989656d8f4ef4a7500804a7b0a3fa21574cb90cc2cea5e55f9" gracePeriod=30 Apr 21 10:26:31.292543 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:31.292507 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4"] Apr 21 10:26:31.292821 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:26:31.292809 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc" containerName="kserve-container" Apr 21 10:26:31.292821 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:31.292822 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc" containerName="kserve-container" Apr 21 10:26:31.292905 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:31.292834 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc" containerName="storage-initializer" Apr 21 10:26:31.292905 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:31.292839 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc" containerName="storage-initializer" Apr 21 10:26:31.292905 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:31.292894 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7851ad51-f5bd-4cec-a8aa-bb04b32e3bfc" containerName="kserve-container" Apr 21 10:26:31.295888 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:31.295859 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" Apr 21 10:26:31.305835 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:31.305809 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4"] Apr 21 10:26:31.396175 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:31.396137 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ff65f8c-5e82-45a9-bd70-52af8e3a4532-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4\" (UID: \"9ff65f8c-5e82-45a9-bd70-52af8e3a4532\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" Apr 21 10:26:31.497587 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:31.497477 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ff65f8c-5e82-45a9-bd70-52af8e3a4532-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4\" (UID: \"9ff65f8c-5e82-45a9-bd70-52af8e3a4532\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" Apr 21 10:26:31.497871 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:31.497853 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ff65f8c-5e82-45a9-bd70-52af8e3a4532-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4\" (UID: \"9ff65f8c-5e82-45a9-bd70-52af8e3a4532\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" Apr 21 10:26:31.606357 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:31.606322 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" Apr 21 10:26:31.733569 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:31.733532 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4"] Apr 21 10:26:31.736619 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:26:31.736589 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ff65f8c_5e82_45a9_bd70_52af8e3a4532.slice/crio-9c5339d5e12db91477565502ad6811ee7d0b30f03217b7d2f2a046ea9aba812d WatchSource:0}: Error finding container 9c5339d5e12db91477565502ad6811ee7d0b30f03217b7d2f2a046ea9aba812d: Status 404 returned error can't find the container with id 9c5339d5e12db91477565502ad6811ee7d0b30f03217b7d2f2a046ea9aba812d Apr 21 10:26:31.964316 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:31.964293 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" Apr 21 10:26:32.000150 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.000102 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f65885b0-ad1a-45e5-8f51-a8dbcce55842-kserve-provision-location\") pod \"f65885b0-ad1a-45e5-8f51-a8dbcce55842\" (UID: \"f65885b0-ad1a-45e5-8f51-a8dbcce55842\") " Apr 21 10:26:32.000438 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.000414 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f65885b0-ad1a-45e5-8f51-a8dbcce55842-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f65885b0-ad1a-45e5-8f51-a8dbcce55842" (UID: "f65885b0-ad1a-45e5-8f51-a8dbcce55842"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:26:32.101263 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.101175 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f65885b0-ad1a-45e5-8f51-a8dbcce55842-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:26:32.389066 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.388980 2577 generic.go:358] "Generic (PLEG): container finished" podID="f65885b0-ad1a-45e5-8f51-a8dbcce55842" containerID="dfa4e898bc01c4989656d8f4ef4a7500804a7b0a3fa21574cb90cc2cea5e55f9" exitCode=0 Apr 21 10:26:32.389066 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.389042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" event={"ID":"f65885b0-ad1a-45e5-8f51-a8dbcce55842","Type":"ContainerDied","Data":"dfa4e898bc01c4989656d8f4ef4a7500804a7b0a3fa21574cb90cc2cea5e55f9"} Apr 21 10:26:32.389066 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.389055 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" Apr 21 10:26:32.389066 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.389067 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4" event={"ID":"f65885b0-ad1a-45e5-8f51-a8dbcce55842","Type":"ContainerDied","Data":"3a6b04f7c2549d464904ae88ba4c170f8a528532ffa1f59566a2c63bb8995f46"} Apr 21 10:26:32.389643 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.389088 2577 scope.go:117] "RemoveContainer" containerID="dfa4e898bc01c4989656d8f4ef4a7500804a7b0a3fa21574cb90cc2cea5e55f9" Apr 21 10:26:32.390642 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.390622 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" event={"ID":"9ff65f8c-5e82-45a9-bd70-52af8e3a4532","Type":"ContainerStarted","Data":"a30b3307802fcbe3c47163fc32202949b4ac250760afce4e768f9aa28f540fff"} Apr 21 10:26:32.390741 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.390647 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" event={"ID":"9ff65f8c-5e82-45a9-bd70-52af8e3a4532","Type":"ContainerStarted","Data":"9c5339d5e12db91477565502ad6811ee7d0b30f03217b7d2f2a046ea9aba812d"} Apr 21 10:26:32.396701 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.396681 2577 scope.go:117] "RemoveContainer" containerID="986348f806cf7b1588f7d447dc1dcac4676ce02af2ebabc8c8b2c6d59db8a7ad" Apr 21 10:26:32.403835 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.403815 2577 scope.go:117] "RemoveContainer" containerID="dfa4e898bc01c4989656d8f4ef4a7500804a7b0a3fa21574cb90cc2cea5e55f9" Apr 21 10:26:32.404108 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:26:32.404088 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dfa4e898bc01c4989656d8f4ef4a7500804a7b0a3fa21574cb90cc2cea5e55f9\": container with ID starting with dfa4e898bc01c4989656d8f4ef4a7500804a7b0a3fa21574cb90cc2cea5e55f9 not found: ID does not exist" containerID="dfa4e898bc01c4989656d8f4ef4a7500804a7b0a3fa21574cb90cc2cea5e55f9" Apr 21 10:26:32.404166 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.404118 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfa4e898bc01c4989656d8f4ef4a7500804a7b0a3fa21574cb90cc2cea5e55f9"} err="failed to get container status \"dfa4e898bc01c4989656d8f4ef4a7500804a7b0a3fa21574cb90cc2cea5e55f9\": rpc error: code = NotFound desc = could not find container \"dfa4e898bc01c4989656d8f4ef4a7500804a7b0a3fa21574cb90cc2cea5e55f9\": container with ID starting with dfa4e898bc01c4989656d8f4ef4a7500804a7b0a3fa21574cb90cc2cea5e55f9 not found: ID does not exist" Apr 21 10:26:32.404166 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.404134 2577 scope.go:117] "RemoveContainer" containerID="986348f806cf7b1588f7d447dc1dcac4676ce02af2ebabc8c8b2c6d59db8a7ad" Apr 21 10:26:32.404359 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:26:32.404343 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"986348f806cf7b1588f7d447dc1dcac4676ce02af2ebabc8c8b2c6d59db8a7ad\": container with ID starting with 986348f806cf7b1588f7d447dc1dcac4676ce02af2ebabc8c8b2c6d59db8a7ad not found: ID does not exist" containerID="986348f806cf7b1588f7d447dc1dcac4676ce02af2ebabc8c8b2c6d59db8a7ad" Apr 21 10:26:32.404398 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.404363 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"986348f806cf7b1588f7d447dc1dcac4676ce02af2ebabc8c8b2c6d59db8a7ad"} err="failed to get container status \"986348f806cf7b1588f7d447dc1dcac4676ce02af2ebabc8c8b2c6d59db8a7ad\": rpc error: code = NotFound desc = could not find container 
\"986348f806cf7b1588f7d447dc1dcac4676ce02af2ebabc8c8b2c6d59db8a7ad\": container with ID starting with 986348f806cf7b1588f7d447dc1dcac4676ce02af2ebabc8c8b2c6d59db8a7ad not found: ID does not exist" Apr 21 10:26:32.423432 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.423399 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4"] Apr 21 10:26:32.426948 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:32.426922 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-4hsr4"] Apr 21 10:26:34.231960 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:34.231925 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f65885b0-ad1a-45e5-8f51-a8dbcce55842" path="/var/lib/kubelet/pods/f65885b0-ad1a-45e5-8f51-a8dbcce55842/volumes" Apr 21 10:26:36.404423 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:36.404386 2577 generic.go:358] "Generic (PLEG): container finished" podID="9ff65f8c-5e82-45a9-bd70-52af8e3a4532" containerID="a30b3307802fcbe3c47163fc32202949b4ac250760afce4e768f9aa28f540fff" exitCode=0 Apr 21 10:26:36.404853 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:36.404457 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" event={"ID":"9ff65f8c-5e82-45a9-bd70-52af8e3a4532","Type":"ContainerDied","Data":"a30b3307802fcbe3c47163fc32202949b4ac250760afce4e768f9aa28f540fff"} Apr 21 10:26:37.409304 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:37.409267 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" event={"ID":"9ff65f8c-5e82-45a9-bd70-52af8e3a4532","Type":"ContainerStarted","Data":"acb9749163d5eca73b1a5f20d4085d57e396c83c19d92ec7568d944e74c29377"} Apr 21 10:26:37.409781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:37.409487 2577 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" Apr 21 10:26:37.424864 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:26:37.424802 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" podStartSLOduration=6.424781576 podStartE2EDuration="6.424781576s" podCreationTimestamp="2026-04-21 10:26:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:26:37.424703618 +0000 UTC m=+1365.763237632" watchObservedRunningTime="2026-04-21 10:26:37.424781576 +0000 UTC m=+1365.763315596" Apr 21 10:27:08.417263 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:08.417235 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" Apr 21 10:27:11.376264 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:11.376231 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4"] Apr 21 10:27:11.376648 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:11.376470 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" podUID="9ff65f8c-5e82-45a9-bd70-52af8e3a4532" containerName="kserve-container" containerID="cri-o://acb9749163d5eca73b1a5f20d4085d57e396c83c19d92ec7568d944e74c29377" gracePeriod=30 Apr 21 10:27:11.449506 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:11.449460 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx"] Apr 21 10:27:11.449871 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:11.449853 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f65885b0-ad1a-45e5-8f51-a8dbcce55842" containerName="kserve-container" Apr 21 10:27:11.449871 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:11.449872 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65885b0-ad1a-45e5-8f51-a8dbcce55842" containerName="kserve-container" Apr 21 10:27:11.450015 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:11.449889 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f65885b0-ad1a-45e5-8f51-a8dbcce55842" containerName="storage-initializer" Apr 21 10:27:11.450015 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:11.449896 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65885b0-ad1a-45e5-8f51-a8dbcce55842" containerName="storage-initializer" Apr 21 10:27:11.450015 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:11.449976 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f65885b0-ad1a-45e5-8f51-a8dbcce55842" containerName="kserve-container" Apr 21 10:27:11.454533 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:11.454508 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" Apr 21 10:27:11.462737 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:11.462706 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx"] Apr 21 10:27:11.499054 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:11.499014 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13c88ff4-eb3b-4c48-8400-6235299efb04-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-86bf795455-r5bzx\" (UID: \"13c88ff4-eb3b-4c48-8400-6235299efb04\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" Apr 21 10:27:11.599992 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:11.599948 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13c88ff4-eb3b-4c48-8400-6235299efb04-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-86bf795455-r5bzx\" (UID: \"13c88ff4-eb3b-4c48-8400-6235299efb04\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" Apr 21 10:27:11.600404 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:11.600378 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13c88ff4-eb3b-4c48-8400-6235299efb04-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-86bf795455-r5bzx\" (UID: \"13c88ff4-eb3b-4c48-8400-6235299efb04\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" Apr 21 10:27:11.765662 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:11.765619 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" Apr 21 10:27:11.895641 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:11.895607 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx"] Apr 21 10:27:11.898740 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:27:11.898710 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13c88ff4_eb3b_4c48_8400_6235299efb04.slice/crio-4e1b1bb40817854d010049caa65dcfd5530b85a2ebf0661da6a842c14ea72c04 WatchSource:0}: Error finding container 4e1b1bb40817854d010049caa65dcfd5530b85a2ebf0661da6a842c14ea72c04: Status 404 returned error can't find the container with id 4e1b1bb40817854d010049caa65dcfd5530b85a2ebf0661da6a842c14ea72c04 Apr 21 10:27:12.519598 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:12.519509 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" event={"ID":"13c88ff4-eb3b-4c48-8400-6235299efb04","Type":"ContainerStarted","Data":"8a8030c03d946b16f0e4958ab057e7bfe5a47d7c62fe04baa1e0d95ef30e6f62"} Apr 21 10:27:12.519598 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:12.519549 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" event={"ID":"13c88ff4-eb3b-4c48-8400-6235299efb04","Type":"ContainerStarted","Data":"4e1b1bb40817854d010049caa65dcfd5530b85a2ebf0661da6a842c14ea72c04"} Apr 21 10:27:12.818468 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:12.818445 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" Apr 21 10:27:12.910546 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:12.910505 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ff65f8c-5e82-45a9-bd70-52af8e3a4532-kserve-provision-location\") pod \"9ff65f8c-5e82-45a9-bd70-52af8e3a4532\" (UID: \"9ff65f8c-5e82-45a9-bd70-52af8e3a4532\") " Apr 21 10:27:12.910923 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:12.910899 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ff65f8c-5e82-45a9-bd70-52af8e3a4532-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9ff65f8c-5e82-45a9-bd70-52af8e3a4532" (UID: "9ff65f8c-5e82-45a9-bd70-52af8e3a4532"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:27:13.011386 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:13.011345 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ff65f8c-5e82-45a9-bd70-52af8e3a4532-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:27:13.523903 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:13.523866 2577 generic.go:358] "Generic (PLEG): container finished" podID="9ff65f8c-5e82-45a9-bd70-52af8e3a4532" containerID="acb9749163d5eca73b1a5f20d4085d57e396c83c19d92ec7568d944e74c29377" exitCode=0 Apr 21 10:27:13.524303 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:13.523934 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" Apr 21 10:27:13.524303 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:13.523957 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" event={"ID":"9ff65f8c-5e82-45a9-bd70-52af8e3a4532","Type":"ContainerDied","Data":"acb9749163d5eca73b1a5f20d4085d57e396c83c19d92ec7568d944e74c29377"} Apr 21 10:27:13.524303 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:13.523993 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4" event={"ID":"9ff65f8c-5e82-45a9-bd70-52af8e3a4532","Type":"ContainerDied","Data":"9c5339d5e12db91477565502ad6811ee7d0b30f03217b7d2f2a046ea9aba812d"} Apr 21 10:27:13.524303 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:13.524010 2577 scope.go:117] "RemoveContainer" containerID="acb9749163d5eca73b1a5f20d4085d57e396c83c19d92ec7568d944e74c29377" Apr 21 10:27:13.532519 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:13.532496 2577 scope.go:117] "RemoveContainer" containerID="a30b3307802fcbe3c47163fc32202949b4ac250760afce4e768f9aa28f540fff" Apr 21 10:27:13.539838 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:13.539820 2577 scope.go:117] "RemoveContainer" containerID="acb9749163d5eca73b1a5f20d4085d57e396c83c19d92ec7568d944e74c29377" Apr 21 10:27:13.540103 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:27:13.540085 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acb9749163d5eca73b1a5f20d4085d57e396c83c19d92ec7568d944e74c29377\": container with ID starting with acb9749163d5eca73b1a5f20d4085d57e396c83c19d92ec7568d944e74c29377 not found: ID does not exist" containerID="acb9749163d5eca73b1a5f20d4085d57e396c83c19d92ec7568d944e74c29377" Apr 21 10:27:13.540144 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:13.540113 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb9749163d5eca73b1a5f20d4085d57e396c83c19d92ec7568d944e74c29377"} err="failed to get container status \"acb9749163d5eca73b1a5f20d4085d57e396c83c19d92ec7568d944e74c29377\": rpc error: code = NotFound desc = could not find container \"acb9749163d5eca73b1a5f20d4085d57e396c83c19d92ec7568d944e74c29377\": container with ID starting with acb9749163d5eca73b1a5f20d4085d57e396c83c19d92ec7568d944e74c29377 not found: ID does not exist" Apr 21 10:27:13.540144 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:13.540133 2577 scope.go:117] "RemoveContainer" containerID="a30b3307802fcbe3c47163fc32202949b4ac250760afce4e768f9aa28f540fff" Apr 21 10:27:13.540406 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:27:13.540387 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a30b3307802fcbe3c47163fc32202949b4ac250760afce4e768f9aa28f540fff\": container with ID starting with a30b3307802fcbe3c47163fc32202949b4ac250760afce4e768f9aa28f540fff not found: ID does not exist" containerID="a30b3307802fcbe3c47163fc32202949b4ac250760afce4e768f9aa28f540fff" Apr 21 10:27:13.540462 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:13.540418 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a30b3307802fcbe3c47163fc32202949b4ac250760afce4e768f9aa28f540fff"} err="failed to get container status \"a30b3307802fcbe3c47163fc32202949b4ac250760afce4e768f9aa28f540fff\": rpc error: code = NotFound desc = could not find container \"a30b3307802fcbe3c47163fc32202949b4ac250760afce4e768f9aa28f540fff\": container with ID starting with a30b3307802fcbe3c47163fc32202949b4ac250760afce4e768f9aa28f540fff not found: ID does not exist" Apr 21 10:27:13.544640 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:13.544611 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4"] Apr 21 10:27:13.546875 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:13.546851 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-dclk4"] Apr 21 10:27:14.231474 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:14.231441 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ff65f8c-5e82-45a9-bd70-52af8e3a4532" path="/var/lib/kubelet/pods/9ff65f8c-5e82-45a9-bd70-52af8e3a4532/volumes" Apr 21 10:27:16.536759 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:16.536710 2577 generic.go:358] "Generic (PLEG): container finished" podID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerID="8a8030c03d946b16f0e4958ab057e7bfe5a47d7c62fe04baa1e0d95ef30e6f62" exitCode=0 Apr 21 10:27:16.537136 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:16.536786 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" event={"ID":"13c88ff4-eb3b-4c48-8400-6235299efb04","Type":"ContainerDied","Data":"8a8030c03d946b16f0e4958ab057e7bfe5a47d7c62fe04baa1e0d95ef30e6f62"} Apr 21 10:27:17.542582 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:17.542544 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" event={"ID":"13c88ff4-eb3b-4c48-8400-6235299efb04","Type":"ContainerStarted","Data":"01e6ee658b638075d27a382fcab3fe07e3f4a33a4eac5f87f6ca9373539b0029"} Apr 21 10:27:19.550764 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:19.550711 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" event={"ID":"13c88ff4-eb3b-4c48-8400-6235299efb04","Type":"ContainerStarted","Data":"0ba2c666fa2f61380ac518bf426452dc8d97e937bae5cb0d9a51df5bf5dac658"} Apr 21 10:27:19.551154 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:19.550836 2577 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" Apr 21 10:27:19.569571 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:19.569518 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" podStartSLOduration=5.806073756 podStartE2EDuration="8.569502727s" podCreationTimestamp="2026-04-21 10:27:11 +0000 UTC" firstStartedPulling="2026-04-21 10:27:16.593727245 +0000 UTC m=+1404.932261241" lastFinishedPulling="2026-04-21 10:27:19.357156214 +0000 UTC m=+1407.695690212" observedRunningTime="2026-04-21 10:27:19.568333278 +0000 UTC m=+1407.906867295" watchObservedRunningTime="2026-04-21 10:27:19.569502727 +0000 UTC m=+1407.908036743" Apr 21 10:27:20.553384 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:20.553354 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" Apr 21 10:27:51.560621 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:27:51.560535 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" Apr 21 10:28:21.561771 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:21.561716 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" Apr 21 10:28:31.523329 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:31.523293 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx"] Apr 21 10:28:31.523725 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:31.523674 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" podUID="13c88ff4-eb3b-4c48-8400-6235299efb04" 
containerName="kserve-container" containerID="cri-o://01e6ee658b638075d27a382fcab3fe07e3f4a33a4eac5f87f6ca9373539b0029" gracePeriod=30 Apr 21 10:28:31.523815 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:31.523729 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" podUID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerName="kserve-agent" containerID="cri-o://0ba2c666fa2f61380ac518bf426452dc8d97e937bae5cb0d9a51df5bf5dac658" gracePeriod=30 Apr 21 10:28:31.581656 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:31.581622 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn"] Apr 21 10:28:31.581981 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:31.581966 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ff65f8c-5e82-45a9-bd70-52af8e3a4532" containerName="storage-initializer" Apr 21 10:28:31.582037 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:31.581984 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff65f8c-5e82-45a9-bd70-52af8e3a4532" containerName="storage-initializer" Apr 21 10:28:31.582037 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:31.582002 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ff65f8c-5e82-45a9-bd70-52af8e3a4532" containerName="kserve-container" Apr 21 10:28:31.582037 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:31.582008 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ff65f8c-5e82-45a9-bd70-52af8e3a4532" containerName="kserve-container" Apr 21 10:28:31.582144 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:31.582065 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ff65f8c-5e82-45a9-bd70-52af8e3a4532" containerName="kserve-container" Apr 21 10:28:31.585003 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:31.584987 2577 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" Apr 21 10:28:31.592768 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:31.592727 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn"] Apr 21 10:28:31.632403 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:31.632366 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3cc2d3f4-3260-449a-879f-7ffe88b9de0c-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-vgsdn\" (UID: \"3cc2d3f4-3260-449a-879f-7ffe88b9de0c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" Apr 21 10:28:31.733582 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:31.733542 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3cc2d3f4-3260-449a-879f-7ffe88b9de0c-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-vgsdn\" (UID: \"3cc2d3f4-3260-449a-879f-7ffe88b9de0c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" Apr 21 10:28:31.734018 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:31.733993 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3cc2d3f4-3260-449a-879f-7ffe88b9de0c-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-vgsdn\" (UID: \"3cc2d3f4-3260-449a-879f-7ffe88b9de0c\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" Apr 21 10:28:31.896539 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:31.896456 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" Apr 21 10:28:32.018569 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:32.018539 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn"] Apr 21 10:28:32.022076 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:28:32.022040 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cc2d3f4_3260_449a_879f_7ffe88b9de0c.slice/crio-f7f865df528fa9f43ced720308524c3974d61b19407fcc242e4bb79690c8a395 WatchSource:0}: Error finding container f7f865df528fa9f43ced720308524c3974d61b19407fcc242e4bb79690c8a395: Status 404 returned error can't find the container with id f7f865df528fa9f43ced720308524c3974d61b19407fcc242e4bb79690c8a395 Apr 21 10:28:32.765005 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:32.764968 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" event={"ID":"3cc2d3f4-3260-449a-879f-7ffe88b9de0c","Type":"ContainerStarted","Data":"d114665166b95f23d00485d2b50dade5a7b84d0aa02c52c57f6ce1276ef47ba0"} Apr 21 10:28:32.765005 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:32.765007 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" event={"ID":"3cc2d3f4-3260-449a-879f-7ffe88b9de0c","Type":"ContainerStarted","Data":"f7f865df528fa9f43ced720308524c3974d61b19407fcc242e4bb79690c8a395"} Apr 21 10:28:33.770433 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:33.770400 2577 generic.go:358] "Generic (PLEG): container finished" podID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerID="01e6ee658b638075d27a382fcab3fe07e3f4a33a4eac5f87f6ca9373539b0029" exitCode=0 Apr 21 10:28:33.770832 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:33.770479 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" event={"ID":"13c88ff4-eb3b-4c48-8400-6235299efb04","Type":"ContainerDied","Data":"01e6ee658b638075d27a382fcab3fe07e3f4a33a4eac5f87f6ca9373539b0029"}
Apr 21 10:28:37.783625 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:37.783590 2577 generic.go:358] "Generic (PLEG): container finished" podID="3cc2d3f4-3260-449a-879f-7ffe88b9de0c" containerID="d114665166b95f23d00485d2b50dade5a7b84d0aa02c52c57f6ce1276ef47ba0" exitCode=0
Apr 21 10:28:37.783625 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:37.783629 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" event={"ID":"3cc2d3f4-3260-449a-879f-7ffe88b9de0c","Type":"ContainerDied","Data":"d114665166b95f23d00485d2b50dade5a7b84d0aa02c52c57f6ce1276ef47ba0"}
Apr 21 10:28:41.558163 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:41.558113 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" podUID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.34:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 21 10:28:49.823465 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:49.823428 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" event={"ID":"3cc2d3f4-3260-449a-879f-7ffe88b9de0c","Type":"ContainerStarted","Data":"e3700db2eab410f136fdce2cf3473a7bf4b4dd5e075879d8113505db5867e3d7"}
Apr 21 10:28:49.824008 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:49.823789 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn"
Apr 21 10:28:49.825082 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:49.825052 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" podUID="3cc2d3f4-3260-449a-879f-7ffe88b9de0c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 21 10:28:49.839478 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:49.839426 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" podStartSLOduration=7.692389601 podStartE2EDuration="18.839409768s" podCreationTimestamp="2026-04-21 10:28:31 +0000 UTC" firstStartedPulling="2026-04-21 10:28:37.784715528 +0000 UTC m=+1486.123249524" lastFinishedPulling="2026-04-21 10:28:48.931735683 +0000 UTC m=+1497.270269691" observedRunningTime="2026-04-21 10:28:49.838001816 +0000 UTC m=+1498.176535832" watchObservedRunningTime="2026-04-21 10:28:49.839409768 +0000 UTC m=+1498.177943784"
Apr 21 10:28:50.827004 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:50.826967 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" podUID="3cc2d3f4-3260-449a-879f-7ffe88b9de0c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 21 10:28:51.558107 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:51.558058 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" podUID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.34:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 21 10:28:52.202796 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:52.202764 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log"
Apr 21 10:28:52.206788 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:28:52.203853 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log"
Apr 21 10:29:00.827858 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:00.827801 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" podUID="3cc2d3f4-3260-449a-879f-7ffe88b9de0c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 21 10:29:01.558281 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.558240 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" podUID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.34:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.134.0.34:8080: connect: connection refused"
Apr 21 10:29:01.558440 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.558376 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx"
Apr 21 10:29:01.561630 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.561595 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" podUID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerName="kserve-agent" probeResult="failure" output="dial tcp 10.134.0.34:9081: connect: connection refused"
Apr 21 10:29:01.720401 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.720377 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx"
Apr 21 10:29:01.859738 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.859642 2577 generic.go:358] "Generic (PLEG): container finished" podID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerID="0ba2c666fa2f61380ac518bf426452dc8d97e937bae5cb0d9a51df5bf5dac658" exitCode=137
Apr 21 10:29:01.859738 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.859713 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" event={"ID":"13c88ff4-eb3b-4c48-8400-6235299efb04","Type":"ContainerDied","Data":"0ba2c666fa2f61380ac518bf426452dc8d97e937bae5cb0d9a51df5bf5dac658"}
Apr 21 10:29:01.859738 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.859737 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx"
Apr 21 10:29:01.860411 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.859742 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx" event={"ID":"13c88ff4-eb3b-4c48-8400-6235299efb04","Type":"ContainerDied","Data":"4e1b1bb40817854d010049caa65dcfd5530b85a2ebf0661da6a842c14ea72c04"}
Apr 21 10:29:01.860411 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.859785 2577 scope.go:117] "RemoveContainer" containerID="0ba2c666fa2f61380ac518bf426452dc8d97e937bae5cb0d9a51df5bf5dac658"
Apr 21 10:29:01.867378 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.867355 2577 scope.go:117] "RemoveContainer" containerID="01e6ee658b638075d27a382fcab3fe07e3f4a33a4eac5f87f6ca9373539b0029"
Apr 21 10:29:01.874523 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.874500 2577 scope.go:117] "RemoveContainer" containerID="8a8030c03d946b16f0e4958ab057e7bfe5a47d7c62fe04baa1e0d95ef30e6f62"
Apr 21 10:29:01.881869 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.881842 2577 scope.go:117] "RemoveContainer" containerID="0ba2c666fa2f61380ac518bf426452dc8d97e937bae5cb0d9a51df5bf5dac658"
Apr 21 10:29:01.882157 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:29:01.882137 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ba2c666fa2f61380ac518bf426452dc8d97e937bae5cb0d9a51df5bf5dac658\": container with ID starting with 0ba2c666fa2f61380ac518bf426452dc8d97e937bae5cb0d9a51df5bf5dac658 not found: ID does not exist" containerID="0ba2c666fa2f61380ac518bf426452dc8d97e937bae5cb0d9a51df5bf5dac658"
Apr 21 10:29:01.882208 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.882168 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ba2c666fa2f61380ac518bf426452dc8d97e937bae5cb0d9a51df5bf5dac658"} err="failed to get container status \"0ba2c666fa2f61380ac518bf426452dc8d97e937bae5cb0d9a51df5bf5dac658\": rpc error: code = NotFound desc = could not find container \"0ba2c666fa2f61380ac518bf426452dc8d97e937bae5cb0d9a51df5bf5dac658\": container with ID starting with 0ba2c666fa2f61380ac518bf426452dc8d97e937bae5cb0d9a51df5bf5dac658 not found: ID does not exist"
Apr 21 10:29:01.882208 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.882188 2577 scope.go:117] "RemoveContainer" containerID="01e6ee658b638075d27a382fcab3fe07e3f4a33a4eac5f87f6ca9373539b0029"
Apr 21 10:29:01.882398 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:29:01.882382 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01e6ee658b638075d27a382fcab3fe07e3f4a33a4eac5f87f6ca9373539b0029\": container with ID starting with 01e6ee658b638075d27a382fcab3fe07e3f4a33a4eac5f87f6ca9373539b0029 not found: ID does not exist" containerID="01e6ee658b638075d27a382fcab3fe07e3f4a33a4eac5f87f6ca9373539b0029"
Apr 21 10:29:01.882437 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.882405 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01e6ee658b638075d27a382fcab3fe07e3f4a33a4eac5f87f6ca9373539b0029"} err="failed to get container status \"01e6ee658b638075d27a382fcab3fe07e3f4a33a4eac5f87f6ca9373539b0029\": rpc error: code = NotFound desc = could not find container \"01e6ee658b638075d27a382fcab3fe07e3f4a33a4eac5f87f6ca9373539b0029\": container with ID starting with 01e6ee658b638075d27a382fcab3fe07e3f4a33a4eac5f87f6ca9373539b0029 not found: ID does not exist"
Apr 21 10:29:01.882437 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.882421 2577 scope.go:117] "RemoveContainer" containerID="8a8030c03d946b16f0e4958ab057e7bfe5a47d7c62fe04baa1e0d95ef30e6f62"
Apr 21 10:29:01.882612 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:29:01.882595 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a8030c03d946b16f0e4958ab057e7bfe5a47d7c62fe04baa1e0d95ef30e6f62\": container with ID starting with 8a8030c03d946b16f0e4958ab057e7bfe5a47d7c62fe04baa1e0d95ef30e6f62 not found: ID does not exist" containerID="8a8030c03d946b16f0e4958ab057e7bfe5a47d7c62fe04baa1e0d95ef30e6f62"
Apr 21 10:29:01.882666 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.882616 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a8030c03d946b16f0e4958ab057e7bfe5a47d7c62fe04baa1e0d95ef30e6f62"} err="failed to get container status \"8a8030c03d946b16f0e4958ab057e7bfe5a47d7c62fe04baa1e0d95ef30e6f62\": rpc error: code = NotFound desc = could not find container \"8a8030c03d946b16f0e4958ab057e7bfe5a47d7c62fe04baa1e0d95ef30e6f62\": container with ID starting with 8a8030c03d946b16f0e4958ab057e7bfe5a47d7c62fe04baa1e0d95ef30e6f62 not found: ID does not exist"
Apr 21 10:29:01.885923 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.885898 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13c88ff4-eb3b-4c48-8400-6235299efb04-kserve-provision-location\") pod \"13c88ff4-eb3b-4c48-8400-6235299efb04\" (UID: \"13c88ff4-eb3b-4c48-8400-6235299efb04\") "
Apr 21 10:29:01.886208 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.886186 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13c88ff4-eb3b-4c48-8400-6235299efb04-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "13c88ff4-eb3b-4c48-8400-6235299efb04" (UID: "13c88ff4-eb3b-4c48-8400-6235299efb04"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:29:01.987305 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:01.987270 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13c88ff4-eb3b-4c48-8400-6235299efb04-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:29:02.181448 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:02.181404 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx"]
Apr 21 10:29:02.186837 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:02.186807 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-86bf795455-r5bzx"]
Apr 21 10:29:02.231489 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:02.231451 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c88ff4-eb3b-4c48-8400-6235299efb04" path="/var/lib/kubelet/pods/13c88ff4-eb3b-4c48-8400-6235299efb04/volumes"
Apr 21 10:29:10.827815 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:10.827699 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" podUID="3cc2d3f4-3260-449a-879f-7ffe88b9de0c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 21 10:29:20.827172 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:20.827126 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" podUID="3cc2d3f4-3260-449a-879f-7ffe88b9de0c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 21 10:29:30.827503 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:30.827461 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" podUID="3cc2d3f4-3260-449a-879f-7ffe88b9de0c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.35:8080: connect: connection refused"
Apr 21 10:29:40.828416 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:40.828383 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn"
Apr 21 10:29:43.058070 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.058033 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn"]
Apr 21 10:29:43.058576 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.058317 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" podUID="3cc2d3f4-3260-449a-879f-7ffe88b9de0c" containerName="kserve-container" containerID="cri-o://e3700db2eab410f136fdce2cf3473a7bf4b4dd5e075879d8113505db5867e3d7" gracePeriod=30
Apr 21 10:29:43.176359 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.176321 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2"]
Apr 21 10:29:43.176668 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.176654 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerName="kserve-container"
Apr 21 10:29:43.176712 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.176670 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerName="kserve-container"
Apr 21 10:29:43.176712 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.176679 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerName="kserve-agent"
Apr 21 10:29:43.176712 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.176684 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerName="kserve-agent"
Apr 21 10:29:43.176712 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.176694 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerName="storage-initializer"
Apr 21 10:29:43.176712 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.176700 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerName="storage-initializer"
Apr 21 10:29:43.176933 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.176772 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerName="kserve-container"
Apr 21 10:29:43.176933 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.176783 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="13c88ff4-eb3b-4c48-8400-6235299efb04" containerName="kserve-agent"
Apr 21 10:29:43.191364 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.191335 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2"]
Apr 21 10:29:43.191515 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.191466 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2"
Apr 21 10:29:43.317883 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.317790 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/506d958a-1f0b-4b51-a98b-77a2ec0b5033-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-zvpx2\" (UID: \"506d958a-1f0b-4b51-a98b-77a2ec0b5033\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2"
Apr 21 10:29:43.418861 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.418816 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/506d958a-1f0b-4b51-a98b-77a2ec0b5033-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-zvpx2\" (UID: \"506d958a-1f0b-4b51-a98b-77a2ec0b5033\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2"
Apr 21 10:29:43.419227 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.419206 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/506d958a-1f0b-4b51-a98b-77a2ec0b5033-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-zvpx2\" (UID: \"506d958a-1f0b-4b51-a98b-77a2ec0b5033\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2"
Apr 21 10:29:43.502575 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.502546 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2"
Apr 21 10:29:43.625948 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.625907 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2"]
Apr 21 10:29:43.629941 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:29:43.629909 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod506d958a_1f0b_4b51_a98b_77a2ec0b5033.slice/crio-ec34405693525ead1c0f3bb56fc2cfa112a22605414f722a6e13785b2e7a26a9 WatchSource:0}: Error finding container ec34405693525ead1c0f3bb56fc2cfa112a22605414f722a6e13785b2e7a26a9: Status 404 returned error can't find the container with id ec34405693525ead1c0f3bb56fc2cfa112a22605414f722a6e13785b2e7a26a9
Apr 21 10:29:43.990448 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.990416 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2" event={"ID":"506d958a-1f0b-4b51-a98b-77a2ec0b5033","Type":"ContainerStarted","Data":"995bf0e9b2db481f3b8d8c5fcc228f26ec8b980384e78345f9a89f87f10ab136"}
Apr 21 10:29:43.990448 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:43.990453 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2" event={"ID":"506d958a-1f0b-4b51-a98b-77a2ec0b5033","Type":"ContainerStarted","Data":"ec34405693525ead1c0f3bb56fc2cfa112a22605414f722a6e13785b2e7a26a9"}
Apr 21 10:29:45.993647 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:45.993618 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn"
Apr 21 10:29:45.997065 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:45.997037 2577 generic.go:358] "Generic (PLEG): container finished" podID="3cc2d3f4-3260-449a-879f-7ffe88b9de0c" containerID="e3700db2eab410f136fdce2cf3473a7bf4b4dd5e075879d8113505db5867e3d7" exitCode=0
Apr 21 10:29:45.997236 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:45.997108 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" event={"ID":"3cc2d3f4-3260-449a-879f-7ffe88b9de0c","Type":"ContainerDied","Data":"e3700db2eab410f136fdce2cf3473a7bf4b4dd5e075879d8113505db5867e3d7"}
Apr 21 10:29:45.997236 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:45.997139 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn" event={"ID":"3cc2d3f4-3260-449a-879f-7ffe88b9de0c","Type":"ContainerDied","Data":"f7f865df528fa9f43ced720308524c3974d61b19407fcc242e4bb79690c8a395"}
Apr 21 10:29:45.997236 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:45.997161 2577 scope.go:117] "RemoveContainer" containerID="e3700db2eab410f136fdce2cf3473a7bf4b4dd5e075879d8113505db5867e3d7"
Apr 21 10:29:45.997236 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:45.997181 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn"
Apr 21 10:29:46.004795 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:46.004768 2577 scope.go:117] "RemoveContainer" containerID="d114665166b95f23d00485d2b50dade5a7b84d0aa02c52c57f6ce1276ef47ba0"
Apr 21 10:29:46.013458 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:46.013432 2577 scope.go:117] "RemoveContainer" containerID="e3700db2eab410f136fdce2cf3473a7bf4b4dd5e075879d8113505db5867e3d7"
Apr 21 10:29:46.013837 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:29:46.013806 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3700db2eab410f136fdce2cf3473a7bf4b4dd5e075879d8113505db5867e3d7\": container with ID starting with e3700db2eab410f136fdce2cf3473a7bf4b4dd5e075879d8113505db5867e3d7 not found: ID does not exist" containerID="e3700db2eab410f136fdce2cf3473a7bf4b4dd5e075879d8113505db5867e3d7"
Apr 21 10:29:46.014007 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:46.013847 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3700db2eab410f136fdce2cf3473a7bf4b4dd5e075879d8113505db5867e3d7"} err="failed to get container status \"e3700db2eab410f136fdce2cf3473a7bf4b4dd5e075879d8113505db5867e3d7\": rpc error: code = NotFound desc = could not find container \"e3700db2eab410f136fdce2cf3473a7bf4b4dd5e075879d8113505db5867e3d7\": container with ID starting with e3700db2eab410f136fdce2cf3473a7bf4b4dd5e075879d8113505db5867e3d7 not found: ID does not exist"
Apr 21 10:29:46.014007 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:46.013886 2577 scope.go:117] "RemoveContainer" containerID="d114665166b95f23d00485d2b50dade5a7b84d0aa02c52c57f6ce1276ef47ba0"
Apr 21 10:29:46.014242 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:29:46.014215 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d114665166b95f23d00485d2b50dade5a7b84d0aa02c52c57f6ce1276ef47ba0\": container with ID starting with d114665166b95f23d00485d2b50dade5a7b84d0aa02c52c57f6ce1276ef47ba0 not found: ID does not exist" containerID="d114665166b95f23d00485d2b50dade5a7b84d0aa02c52c57f6ce1276ef47ba0"
Apr 21 10:29:46.014327 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:46.014250 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d114665166b95f23d00485d2b50dade5a7b84d0aa02c52c57f6ce1276ef47ba0"} err="failed to get container status \"d114665166b95f23d00485d2b50dade5a7b84d0aa02c52c57f6ce1276ef47ba0\": rpc error: code = NotFound desc = could not find container \"d114665166b95f23d00485d2b50dade5a7b84d0aa02c52c57f6ce1276ef47ba0\": container with ID starting with d114665166b95f23d00485d2b50dade5a7b84d0aa02c52c57f6ce1276ef47ba0 not found: ID does not exist"
Apr 21 10:29:46.140839 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:46.140726 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3cc2d3f4-3260-449a-879f-7ffe88b9de0c-kserve-provision-location\") pod \"3cc2d3f4-3260-449a-879f-7ffe88b9de0c\" (UID: \"3cc2d3f4-3260-449a-879f-7ffe88b9de0c\") "
Apr 21 10:29:46.150575 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:46.150543 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3cc2d3f4-3260-449a-879f-7ffe88b9de0c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3cc2d3f4-3260-449a-879f-7ffe88b9de0c" (UID: "3cc2d3f4-3260-449a-879f-7ffe88b9de0c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:29:46.242218 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:46.242190 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3cc2d3f4-3260-449a-879f-7ffe88b9de0c-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:29:46.312913 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:46.312872 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn"]
Apr 21 10:29:46.314435 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:46.314409 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-vgsdn"]
Apr 21 10:29:48.004781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:48.004729 2577 generic.go:358] "Generic (PLEG): container finished" podID="506d958a-1f0b-4b51-a98b-77a2ec0b5033" containerID="995bf0e9b2db481f3b8d8c5fcc228f26ec8b980384e78345f9a89f87f10ab136" exitCode=0
Apr 21 10:29:48.005157 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:48.004799 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2" event={"ID":"506d958a-1f0b-4b51-a98b-77a2ec0b5033","Type":"ContainerDied","Data":"995bf0e9b2db481f3b8d8c5fcc228f26ec8b980384e78345f9a89f87f10ab136"}
Apr 21 10:29:48.233947 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:48.232002 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cc2d3f4-3260-449a-879f-7ffe88b9de0c" path="/var/lib/kubelet/pods/3cc2d3f4-3260-449a-879f-7ffe88b9de0c/volumes"
Apr 21 10:29:49.010126 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:49.010089 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2" event={"ID":"506d958a-1f0b-4b51-a98b-77a2ec0b5033","Type":"ContainerStarted","Data":"ff34eebce011b0afc5adeabb612af9196447ec2f705843eace1c5110cf0d309b"}
Apr 21 10:29:49.010569 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:49.010467 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2"
Apr 21 10:29:49.012021 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:49.011992 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2" podUID="506d958a-1f0b-4b51-a98b-77a2ec0b5033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 21 10:29:49.028137 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:49.028080 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2" podStartSLOduration=6.028064926 podStartE2EDuration="6.028064926s" podCreationTimestamp="2026-04-21 10:29:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:29:49.026856972 +0000 UTC m=+1557.365390991" watchObservedRunningTime="2026-04-21 10:29:49.028064926 +0000 UTC m=+1557.366598942"
Apr 21 10:29:50.014044 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:29:50.013999 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2" podUID="506d958a-1f0b-4b51-a98b-77a2ec0b5033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 21 10:30:00.014445 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:00.014396 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2" podUID="506d958a-1f0b-4b51-a98b-77a2ec0b5033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 21 10:30:10.014501 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:10.014458 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2" podUID="506d958a-1f0b-4b51-a98b-77a2ec0b5033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 21 10:30:20.014589 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:20.014540 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2" podUID="506d958a-1f0b-4b51-a98b-77a2ec0b5033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 21 10:30:30.014482 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:30.014435 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2" podUID="506d958a-1f0b-4b51-a98b-77a2ec0b5033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.36:8080: connect: connection refused"
Apr 21 10:30:40.015743 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:40.015709 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2"
Apr 21 10:30:44.550571 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:44.550484 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2"]
Apr 21 10:30:44.550991 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:44.550789 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2" podUID="506d958a-1f0b-4b51-a98b-77a2ec0b5033" containerName="kserve-container" containerID="cri-o://ff34eebce011b0afc5adeabb612af9196447ec2f705843eace1c5110cf0d309b" gracePeriod=30
Apr 21 10:30:44.612092 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:44.612055 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt"]
Apr 21 10:30:44.612348 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:44.612337 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3cc2d3f4-3260-449a-879f-7ffe88b9de0c" containerName="storage-initializer"
Apr 21 10:30:44.612390 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:44.612350 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cc2d3f4-3260-449a-879f-7ffe88b9de0c" containerName="storage-initializer"
Apr 21 10:30:44.612390 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:44.612367 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3cc2d3f4-3260-449a-879f-7ffe88b9de0c" containerName="kserve-container"
Apr 21 10:30:44.612390 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:44.612372 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cc2d3f4-3260-449a-879f-7ffe88b9de0c" containerName="kserve-container"
Apr 21 10:30:44.612481 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:44.612428 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="3cc2d3f4-3260-449a-879f-7ffe88b9de0c" containerName="kserve-container"
Apr 21 10:30:44.618871 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:44.618840 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt"
Apr 21 10:30:44.622137 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:44.622106 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt"]
Apr 21 10:30:44.682518 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:44.682482 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bc9e230-9ff4-4c1a-8152-cfa7e2379451-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt\" (UID: \"1bc9e230-9ff4-4c1a-8152-cfa7e2379451\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt"
Apr 21 10:30:44.783131 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:44.783090 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bc9e230-9ff4-4c1a-8152-cfa7e2379451-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt\" (UID: \"1bc9e230-9ff4-4c1a-8152-cfa7e2379451\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt"
Apr 21 10:30:44.783498 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:44.783474 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bc9e230-9ff4-4c1a-8152-cfa7e2379451-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt\" (UID: \"1bc9e230-9ff4-4c1a-8152-cfa7e2379451\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt"
Apr 21 10:30:44.932053 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:44.931959 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt"
Apr 21 10:30:45.052739 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:45.052714 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt"]
Apr 21 10:30:45.055322 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:30:45.055289 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bc9e230_9ff4_4c1a_8152_cfa7e2379451.slice/crio-f618aaba4df8ada2beef11c57d18e6fd820407c689f91011c17a7741a9a5be8e WatchSource:0}: Error finding container f618aaba4df8ada2beef11c57d18e6fd820407c689f91011c17a7741a9a5be8e: Status 404 returned error can't find the container with id f618aaba4df8ada2beef11c57d18e6fd820407c689f91011c17a7741a9a5be8e
Apr 21 10:30:45.172350 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:45.172314 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" event={"ID":"1bc9e230-9ff4-4c1a-8152-cfa7e2379451","Type":"ContainerStarted","Data":"17b40686585813b9e477bca8b0412634358be7c7ce8d6157bc6cc9f6336f1e32"}
Apr 21 10:30:45.172350 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:45.172354 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" event={"ID":"1bc9e230-9ff4-4c1a-8152-cfa7e2379451","Type":"ContainerStarted","Data":"f618aaba4df8ada2beef11c57d18e6fd820407c689f91011c17a7741a9a5be8e"}
Apr 21 10:30:47.489946 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:47.489925 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2"
Apr 21 10:30:47.603211 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:47.603120 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/506d958a-1f0b-4b51-a98b-77a2ec0b5033-kserve-provision-location\") pod \"506d958a-1f0b-4b51-a98b-77a2ec0b5033\" (UID: \"506d958a-1f0b-4b51-a98b-77a2ec0b5033\") "
Apr 21 10:30:47.612961 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:47.612924 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/506d958a-1f0b-4b51-a98b-77a2ec0b5033-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "506d958a-1f0b-4b51-a98b-77a2ec0b5033" (UID: "506d958a-1f0b-4b51-a98b-77a2ec0b5033"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:30:47.704090 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:47.704046 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/506d958a-1f0b-4b51-a98b-77a2ec0b5033-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:30:48.181551 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:48.181514 2577 generic.go:358] "Generic (PLEG): container finished" podID="506d958a-1f0b-4b51-a98b-77a2ec0b5033" containerID="ff34eebce011b0afc5adeabb612af9196447ec2f705843eace1c5110cf0d309b" exitCode=0
Apr 21 10:30:48.181767 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:48.181598 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2"
Apr 21 10:30:48.181767 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:48.181597 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2" event={"ID":"506d958a-1f0b-4b51-a98b-77a2ec0b5033","Type":"ContainerDied","Data":"ff34eebce011b0afc5adeabb612af9196447ec2f705843eace1c5110cf0d309b"}
Apr 21 10:30:48.181767 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:48.181635 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2" event={"ID":"506d958a-1f0b-4b51-a98b-77a2ec0b5033","Type":"ContainerDied","Data":"ec34405693525ead1c0f3bb56fc2cfa112a22605414f722a6e13785b2e7a26a9"}
Apr 21 10:30:48.181767 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:48.181649 2577 scope.go:117] "RemoveContainer" containerID="ff34eebce011b0afc5adeabb612af9196447ec2f705843eace1c5110cf0d309b"
Apr 21 10:30:48.190515 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:48.190498 2577 scope.go:117] "RemoveContainer" containerID="995bf0e9b2db481f3b8d8c5fcc228f26ec8b980384e78345f9a89f87f10ab136"
Apr 21 10:30:48.197822 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:48.197803 2577 scope.go:117] "RemoveContainer" containerID="ff34eebce011b0afc5adeabb612af9196447ec2f705843eace1c5110cf0d309b"
Apr 21 10:30:48.198096 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:30:48.198078 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff34eebce011b0afc5adeabb612af9196447ec2f705843eace1c5110cf0d309b\": container with ID starting with ff34eebce011b0afc5adeabb612af9196447ec2f705843eace1c5110cf0d309b not found: ID does not exist" containerID="ff34eebce011b0afc5adeabb612af9196447ec2f705843eace1c5110cf0d309b"
Apr 21 10:30:48.198151 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:48.198105 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff34eebce011b0afc5adeabb612af9196447ec2f705843eace1c5110cf0d309b"} err="failed to get container status \"ff34eebce011b0afc5adeabb612af9196447ec2f705843eace1c5110cf0d309b\": rpc error: code = NotFound desc = could not find container \"ff34eebce011b0afc5adeabb612af9196447ec2f705843eace1c5110cf0d309b\": container with ID starting with ff34eebce011b0afc5adeabb612af9196447ec2f705843eace1c5110cf0d309b not found: ID does not exist" Apr 21 10:30:48.198151 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:48.198124 2577 scope.go:117] "RemoveContainer" containerID="995bf0e9b2db481f3b8d8c5fcc228f26ec8b980384e78345f9a89f87f10ab136" Apr 21 10:30:48.198360 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:30:48.198340 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"995bf0e9b2db481f3b8d8c5fcc228f26ec8b980384e78345f9a89f87f10ab136\": container with ID starting with 995bf0e9b2db481f3b8d8c5fcc228f26ec8b980384e78345f9a89f87f10ab136 not found: ID does not exist" containerID="995bf0e9b2db481f3b8d8c5fcc228f26ec8b980384e78345f9a89f87f10ab136" Apr 21 10:30:48.198415 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:48.198370 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"995bf0e9b2db481f3b8d8c5fcc228f26ec8b980384e78345f9a89f87f10ab136"} err="failed to get container status \"995bf0e9b2db481f3b8d8c5fcc228f26ec8b980384e78345f9a89f87f10ab136\": rpc error: code = NotFound desc = could not find container \"995bf0e9b2db481f3b8d8c5fcc228f26ec8b980384e78345f9a89f87f10ab136\": container with ID starting with 995bf0e9b2db481f3b8d8c5fcc228f26ec8b980384e78345f9a89f87f10ab136 not found: ID does not exist" Apr 21 10:30:48.201328 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:48.201306 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2"] Apr 21 10:30:48.203284 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:48.203264 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvpx2"] Apr 21 10:30:48.232127 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:48.232081 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="506d958a-1f0b-4b51-a98b-77a2ec0b5033" path="/var/lib/kubelet/pods/506d958a-1f0b-4b51-a98b-77a2ec0b5033/volumes" Apr 21 10:30:50.192003 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:50.191964 2577 generic.go:358] "Generic (PLEG): container finished" podID="1bc9e230-9ff4-4c1a-8152-cfa7e2379451" containerID="17b40686585813b9e477bca8b0412634358be7c7ce8d6157bc6cc9f6336f1e32" exitCode=0 Apr 21 10:30:50.192400 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:50.192038 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" event={"ID":"1bc9e230-9ff4-4c1a-8152-cfa7e2379451","Type":"ContainerDied","Data":"17b40686585813b9e477bca8b0412634358be7c7ce8d6157bc6cc9f6336f1e32"} Apr 21 10:30:51.196969 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:51.196932 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" event={"ID":"1bc9e230-9ff4-4c1a-8152-cfa7e2379451","Type":"ContainerStarted","Data":"afaf01f7daf1012bc0e27da9cdf8816a37c1cf32f99fdac488e8641bbac018a3"} Apr 21 10:30:51.197566 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:51.197228 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" Apr 21 10:30:51.198765 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:51.198719 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" 
podUID="1bc9e230-9ff4-4c1a-8152-cfa7e2379451" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 21 10:30:51.212239 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:51.212182 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" podStartSLOduration=7.212166165 podStartE2EDuration="7.212166165s" podCreationTimestamp="2026-04-21 10:30:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:30:51.21098188 +0000 UTC m=+1619.549515896" watchObservedRunningTime="2026-04-21 10:30:51.212166165 +0000 UTC m=+1619.550700182" Apr 21 10:30:52.200589 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:30:52.200545 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" podUID="1bc9e230-9ff4-4c1a-8152-cfa7e2379451" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 21 10:31:02.201095 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:02.201034 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" podUID="1bc9e230-9ff4-4c1a-8152-cfa7e2379451" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 21 10:31:12.200854 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:12.200803 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" podUID="1bc9e230-9ff4-4c1a-8152-cfa7e2379451" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 21 10:31:22.201062 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:22.201005 
2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" podUID="1bc9e230-9ff4-4c1a-8152-cfa7e2379451" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 21 10:31:32.200966 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:32.200918 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" podUID="1bc9e230-9ff4-4c1a-8152-cfa7e2379451" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.37:8080: connect: connection refused" Apr 21 10:31:42.201957 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:42.201924 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" Apr 21 10:31:46.311850 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.311812 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt"] Apr 21 10:31:46.312249 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.312083 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" podUID="1bc9e230-9ff4-4c1a-8152-cfa7e2379451" containerName="kserve-container" containerID="cri-o://afaf01f7daf1012bc0e27da9cdf8816a37c1cf32f99fdac488e8641bbac018a3" gracePeriod=30 Apr 21 10:31:46.409463 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.409425 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh"] Apr 21 10:31:46.409741 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.409728 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="506d958a-1f0b-4b51-a98b-77a2ec0b5033" containerName="storage-initializer" Apr 21 10:31:46.409822 
ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.409759 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="506d958a-1f0b-4b51-a98b-77a2ec0b5033" containerName="storage-initializer" Apr 21 10:31:46.409822 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.409784 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="506d958a-1f0b-4b51-a98b-77a2ec0b5033" containerName="kserve-container" Apr 21 10:31:46.409822 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.409791 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="506d958a-1f0b-4b51-a98b-77a2ec0b5033" containerName="kserve-container" Apr 21 10:31:46.409925 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.409850 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="506d958a-1f0b-4b51-a98b-77a2ec0b5033" containerName="kserve-container" Apr 21 10:31:46.412850 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.412834 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" Apr 21 10:31:46.422473 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.422439 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh"] Apr 21 10:31:46.453503 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.453461 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d1c0253-c661-455d-a668-36464c700640-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-j64zh\" (UID: \"4d1c0253-c661-455d-a668-36464c700640\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" Apr 21 10:31:46.554441 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.554398 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/4d1c0253-c661-455d-a668-36464c700640-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-j64zh\" (UID: \"4d1c0253-c661-455d-a668-36464c700640\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" Apr 21 10:31:46.554773 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.554728 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d1c0253-c661-455d-a668-36464c700640-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-j64zh\" (UID: \"4d1c0253-c661-455d-a668-36464c700640\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" Apr 21 10:31:46.723553 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.723516 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" Apr 21 10:31:46.849451 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.849419 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh"] Apr 21 10:31:46.852561 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:31:46.852528 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d1c0253_c661_455d_a668_36464c700640.slice/crio-17889f51855c61b1a378540e5e475691ddf68d5491c2eaf96376182c5ee71428 WatchSource:0}: Error finding container 17889f51855c61b1a378540e5e475691ddf68d5491c2eaf96376182c5ee71428: Status 404 returned error can't find the container with id 17889f51855c61b1a378540e5e475691ddf68d5491c2eaf96376182c5ee71428 Apr 21 10:31:46.854336 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:46.854317 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:31:47.357373 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:47.357333 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" event={"ID":"4d1c0253-c661-455d-a668-36464c700640","Type":"ContainerStarted","Data":"a8cbce5bad82cc2005e6174464701d03852311d102b8ffabc87ab0fee1ced412"} Apr 21 10:31:47.357814 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:47.357380 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" event={"ID":"4d1c0253-c661-455d-a668-36464c700640","Type":"ContainerStarted","Data":"17889f51855c61b1a378540e5e475691ddf68d5491c2eaf96376182c5ee71428"} Apr 21 10:31:49.260052 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.260026 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" Apr 21 10:31:49.274150 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.274126 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bc9e230-9ff4-4c1a-8152-cfa7e2379451-kserve-provision-location\") pod \"1bc9e230-9ff4-4c1a-8152-cfa7e2379451\" (UID: \"1bc9e230-9ff4-4c1a-8152-cfa7e2379451\") " Apr 21 10:31:49.285596 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.285557 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bc9e230-9ff4-4c1a-8152-cfa7e2379451-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1bc9e230-9ff4-4c1a-8152-cfa7e2379451" (UID: "1bc9e230-9ff4-4c1a-8152-cfa7e2379451"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:31:49.365001 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.364915 2577 generic.go:358] "Generic (PLEG): container finished" podID="1bc9e230-9ff4-4c1a-8152-cfa7e2379451" containerID="afaf01f7daf1012bc0e27da9cdf8816a37c1cf32f99fdac488e8641bbac018a3" exitCode=0 Apr 21 10:31:49.365001 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.364989 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" Apr 21 10:31:49.365001 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.364996 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" event={"ID":"1bc9e230-9ff4-4c1a-8152-cfa7e2379451","Type":"ContainerDied","Data":"afaf01f7daf1012bc0e27da9cdf8816a37c1cf32f99fdac488e8641bbac018a3"} Apr 21 10:31:49.365244 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.365024 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt" event={"ID":"1bc9e230-9ff4-4c1a-8152-cfa7e2379451","Type":"ContainerDied","Data":"f618aaba4df8ada2beef11c57d18e6fd820407c689f91011c17a7741a9a5be8e"} Apr 21 10:31:49.365244 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.365040 2577 scope.go:117] "RemoveContainer" containerID="afaf01f7daf1012bc0e27da9cdf8816a37c1cf32f99fdac488e8641bbac018a3" Apr 21 10:31:49.373689 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.373671 2577 scope.go:117] "RemoveContainer" containerID="17b40686585813b9e477bca8b0412634358be7c7ce8d6157bc6cc9f6336f1e32" Apr 21 10:31:49.374569 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.374521 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bc9e230-9ff4-4c1a-8152-cfa7e2379451-kserve-provision-location\") on node 
\"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:31:49.381150 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.381132 2577 scope.go:117] "RemoveContainer" containerID="afaf01f7daf1012bc0e27da9cdf8816a37c1cf32f99fdac488e8641bbac018a3" Apr 21 10:31:49.381418 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:31:49.381398 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afaf01f7daf1012bc0e27da9cdf8816a37c1cf32f99fdac488e8641bbac018a3\": container with ID starting with afaf01f7daf1012bc0e27da9cdf8816a37c1cf32f99fdac488e8641bbac018a3 not found: ID does not exist" containerID="afaf01f7daf1012bc0e27da9cdf8816a37c1cf32f99fdac488e8641bbac018a3" Apr 21 10:31:49.381468 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.381429 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afaf01f7daf1012bc0e27da9cdf8816a37c1cf32f99fdac488e8641bbac018a3"} err="failed to get container status \"afaf01f7daf1012bc0e27da9cdf8816a37c1cf32f99fdac488e8641bbac018a3\": rpc error: code = NotFound desc = could not find container \"afaf01f7daf1012bc0e27da9cdf8816a37c1cf32f99fdac488e8641bbac018a3\": container with ID starting with afaf01f7daf1012bc0e27da9cdf8816a37c1cf32f99fdac488e8641bbac018a3 not found: ID does not exist" Apr 21 10:31:49.381468 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.381448 2577 scope.go:117] "RemoveContainer" containerID="17b40686585813b9e477bca8b0412634358be7c7ce8d6157bc6cc9f6336f1e32" Apr 21 10:31:49.381660 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:31:49.381644 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17b40686585813b9e477bca8b0412634358be7c7ce8d6157bc6cc9f6336f1e32\": container with ID starting with 17b40686585813b9e477bca8b0412634358be7c7ce8d6157bc6cc9f6336f1e32 not found: ID does not exist" 
containerID="17b40686585813b9e477bca8b0412634358be7c7ce8d6157bc6cc9f6336f1e32" Apr 21 10:31:49.381697 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.381665 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17b40686585813b9e477bca8b0412634358be7c7ce8d6157bc6cc9f6336f1e32"} err="failed to get container status \"17b40686585813b9e477bca8b0412634358be7c7ce8d6157bc6cc9f6336f1e32\": rpc error: code = NotFound desc = could not find container \"17b40686585813b9e477bca8b0412634358be7c7ce8d6157bc6cc9f6336f1e32\": container with ID starting with 17b40686585813b9e477bca8b0412634358be7c7ce8d6157bc6cc9f6336f1e32 not found: ID does not exist" Apr 21 10:31:49.385499 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.385476 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt"] Apr 21 10:31:49.389677 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:49.389651 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-thvgt"] Apr 21 10:31:50.231237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:50.231201 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bc9e230-9ff4-4c1a-8152-cfa7e2379451" path="/var/lib/kubelet/pods/1bc9e230-9ff4-4c1a-8152-cfa7e2379451/volumes" Apr 21 10:31:51.373674 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:51.373643 2577 generic.go:358] "Generic (PLEG): container finished" podID="4d1c0253-c661-455d-a668-36464c700640" containerID="a8cbce5bad82cc2005e6174464701d03852311d102b8ffabc87ab0fee1ced412" exitCode=0 Apr 21 10:31:51.374078 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:51.373715 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" 
event={"ID":"4d1c0253-c661-455d-a668-36464c700640","Type":"ContainerDied","Data":"a8cbce5bad82cc2005e6174464701d03852311d102b8ffabc87ab0fee1ced412"} Apr 21 10:31:59.409240 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:59.409200 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" event={"ID":"4d1c0253-c661-455d-a668-36464c700640","Type":"ContainerStarted","Data":"2e88f85668b5db17271baae3c54f79fb45101b38db5874beb374b183cf906bac"} Apr 21 10:31:59.409770 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:59.409546 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" Apr 21 10:31:59.410891 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:59.410867 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" podUID="4d1c0253-c661-455d-a668-36464c700640" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 21 10:31:59.425688 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:31:59.425643 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" podStartSLOduration=6.458418658 podStartE2EDuration="13.425629732s" podCreationTimestamp="2026-04-21 10:31:46 +0000 UTC" firstStartedPulling="2026-04-21 10:31:51.374881501 +0000 UTC m=+1679.713415495" lastFinishedPulling="2026-04-21 10:31:58.342092571 +0000 UTC m=+1686.680626569" observedRunningTime="2026-04-21 10:31:59.423195226 +0000 UTC m=+1687.761729242" watchObservedRunningTime="2026-04-21 10:31:59.425629732 +0000 UTC m=+1687.764163829" Apr 21 10:32:00.412706 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:32:00.412670 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" podUID="4d1c0253-c661-455d-a668-36464c700640" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 21 10:32:10.413147 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:32:10.413100 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" podUID="4d1c0253-c661-455d-a668-36464c700640" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 21 10:32:20.413247 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:32:20.413205 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" podUID="4d1c0253-c661-455d-a668-36464c700640" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 21 10:32:30.412984 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:32:30.412933 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" podUID="4d1c0253-c661-455d-a668-36464c700640" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 21 10:32:40.413248 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:32:40.413199 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" podUID="4d1c0253-c661-455d-a668-36464c700640" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 21 10:32:50.412929 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:32:50.412881 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" podUID="4d1c0253-c661-455d-a668-36464c700640" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 21 10:33:00.412834 ip-10-0-133-157 
kubenswrapper[2577]: I0421 10:33:00.412785 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" podUID="4d1c0253-c661-455d-a668-36464c700640" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 21 10:33:10.413578 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:10.413530 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" podUID="4d1c0253-c661-455d-a668-36464c700640" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 21 10:33:20.414444 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:20.414414 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" Apr 21 10:33:27.449447 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:27.449409 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh"] Apr 21 10:33:27.449913 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:27.449718 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" podUID="4d1c0253-c661-455d-a668-36464c700640" containerName="kserve-container" containerID="cri-o://2e88f85668b5db17271baae3c54f79fb45101b38db5874beb374b183cf906bac" gracePeriod=30 Apr 21 10:33:27.555672 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:27.555637 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4"] Apr 21 10:33:27.556067 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:27.556052 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bc9e230-9ff4-4c1a-8152-cfa7e2379451" containerName="storage-initializer" Apr 21 10:33:27.556115 ip-10-0-133-157 
kubenswrapper[2577]: I0421 10:33:27.556069 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc9e230-9ff4-4c1a-8152-cfa7e2379451" containerName="storage-initializer" Apr 21 10:33:27.556115 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:27.556084 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bc9e230-9ff4-4c1a-8152-cfa7e2379451" containerName="kserve-container" Apr 21 10:33:27.556115 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:27.556092 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bc9e230-9ff4-4c1a-8152-cfa7e2379451" containerName="kserve-container" Apr 21 10:33:27.556210 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:27.556171 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="1bc9e230-9ff4-4c1a-8152-cfa7e2379451" containerName="kserve-container" Apr 21 10:33:27.559242 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:27.559226 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" Apr 21 10:33:27.567527 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:27.567287 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4"] Apr 21 10:33:27.638937 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:27.638897 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d383273-d751-42bf-b0f8-06633c766310-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-dcjj4\" (UID: \"5d383273-d751-42bf-b0f8-06633c766310\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" Apr 21 10:33:27.739829 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:27.739783 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/5d383273-d751-42bf-b0f8-06633c766310-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-dcjj4\" (UID: \"5d383273-d751-42bf-b0f8-06633c766310\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" Apr 21 10:33:27.740173 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:27.740149 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d383273-d751-42bf-b0f8-06633c766310-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-dcjj4\" (UID: \"5d383273-d751-42bf-b0f8-06633c766310\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" Apr 21 10:33:27.872017 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:27.871979 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" Apr 21 10:33:27.992076 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:27.991996 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4"] Apr 21 10:33:27.995578 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:33:27.995548 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d383273_d751_42bf_b0f8_06633c766310.slice/crio-fb1ab16b93f1e3f1ead0c1d61d0c47c4d207b4dea269d1904519d9358d0f1efc WatchSource:0}: Error finding container fb1ab16b93f1e3f1ead0c1d61d0c47c4d207b4dea269d1904519d9358d0f1efc: Status 404 returned error can't find the container with id fb1ab16b93f1e3f1ead0c1d61d0c47c4d207b4dea269d1904519d9358d0f1efc Apr 21 10:33:28.658114 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:28.658071 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" 
event={"ID":"5d383273-d751-42bf-b0f8-06633c766310","Type":"ContainerStarted","Data":"73d7a5549fa475bef3f131e34c66cfd92129afb5ad44eb81542b2c3110a90d1b"} Apr 21 10:33:28.658114 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:28.658111 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" event={"ID":"5d383273-d751-42bf-b0f8-06633c766310","Type":"ContainerStarted","Data":"fb1ab16b93f1e3f1ead0c1d61d0c47c4d207b4dea269d1904519d9358d0f1efc"} Apr 21 10:33:30.413105 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:30.413063 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" podUID="4d1c0253-c661-455d-a668-36464c700640" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.38:8080: connect: connection refused" Apr 21 10:33:31.198045 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.198015 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" Apr 21 10:33:31.268541 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.268446 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d1c0253-c661-455d-a668-36464c700640-kserve-provision-location\") pod \"4d1c0253-c661-455d-a668-36464c700640\" (UID: \"4d1c0253-c661-455d-a668-36464c700640\") " Apr 21 10:33:31.268823 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.268799 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d1c0253-c661-455d-a668-36464c700640-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4d1c0253-c661-455d-a668-36464c700640" (UID: "4d1c0253-c661-455d-a668-36464c700640"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:33:31.369094 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.369051 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d1c0253-c661-455d-a668-36464c700640-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:33:31.669383 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.669351 2577 generic.go:358] "Generic (PLEG): container finished" podID="4d1c0253-c661-455d-a668-36464c700640" containerID="2e88f85668b5db17271baae3c54f79fb45101b38db5874beb374b183cf906bac" exitCode=0 Apr 21 10:33:31.669856 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.669442 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" Apr 21 10:33:31.669856 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.669436 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" event={"ID":"4d1c0253-c661-455d-a668-36464c700640","Type":"ContainerDied","Data":"2e88f85668b5db17271baae3c54f79fb45101b38db5874beb374b183cf906bac"} Apr 21 10:33:31.669856 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.669566 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh" event={"ID":"4d1c0253-c661-455d-a668-36464c700640","Type":"ContainerDied","Data":"17889f51855c61b1a378540e5e475691ddf68d5491c2eaf96376182c5ee71428"} Apr 21 10:33:31.669856 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.669603 2577 scope.go:117] "RemoveContainer" containerID="2e88f85668b5db17271baae3c54f79fb45101b38db5874beb374b183cf906bac" Apr 21 10:33:31.671038 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.671008 2577 generic.go:358] "Generic (PLEG): container finished" podID="5d383273-d751-42bf-b0f8-06633c766310" 
containerID="73d7a5549fa475bef3f131e34c66cfd92129afb5ad44eb81542b2c3110a90d1b" exitCode=0 Apr 21 10:33:31.671150 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.671035 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" event={"ID":"5d383273-d751-42bf-b0f8-06633c766310","Type":"ContainerDied","Data":"73d7a5549fa475bef3f131e34c66cfd92129afb5ad44eb81542b2c3110a90d1b"} Apr 21 10:33:31.678432 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.678412 2577 scope.go:117] "RemoveContainer" containerID="a8cbce5bad82cc2005e6174464701d03852311d102b8ffabc87ab0fee1ced412" Apr 21 10:33:31.686287 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.686253 2577 scope.go:117] "RemoveContainer" containerID="2e88f85668b5db17271baae3c54f79fb45101b38db5874beb374b183cf906bac" Apr 21 10:33:31.686849 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:33:31.686817 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e88f85668b5db17271baae3c54f79fb45101b38db5874beb374b183cf906bac\": container with ID starting with 2e88f85668b5db17271baae3c54f79fb45101b38db5874beb374b183cf906bac not found: ID does not exist" containerID="2e88f85668b5db17271baae3c54f79fb45101b38db5874beb374b183cf906bac" Apr 21 10:33:31.687015 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.686862 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e88f85668b5db17271baae3c54f79fb45101b38db5874beb374b183cf906bac"} err="failed to get container status \"2e88f85668b5db17271baae3c54f79fb45101b38db5874beb374b183cf906bac\": rpc error: code = NotFound desc = could not find container \"2e88f85668b5db17271baae3c54f79fb45101b38db5874beb374b183cf906bac\": container with ID starting with 2e88f85668b5db17271baae3c54f79fb45101b38db5874beb374b183cf906bac not found: ID does not exist" Apr 21 10:33:31.687015 ip-10-0-133-157 kubenswrapper[2577]: 
I0421 10:33:31.686891 2577 scope.go:117] "RemoveContainer" containerID="a8cbce5bad82cc2005e6174464701d03852311d102b8ffabc87ab0fee1ced412" Apr 21 10:33:31.687220 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:33:31.687192 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8cbce5bad82cc2005e6174464701d03852311d102b8ffabc87ab0fee1ced412\": container with ID starting with a8cbce5bad82cc2005e6174464701d03852311d102b8ffabc87ab0fee1ced412 not found: ID does not exist" containerID="a8cbce5bad82cc2005e6174464701d03852311d102b8ffabc87ab0fee1ced412" Apr 21 10:33:31.687320 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.687229 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8cbce5bad82cc2005e6174464701d03852311d102b8ffabc87ab0fee1ced412"} err="failed to get container status \"a8cbce5bad82cc2005e6174464701d03852311d102b8ffabc87ab0fee1ced412\": rpc error: code = NotFound desc = could not find container \"a8cbce5bad82cc2005e6174464701d03852311d102b8ffabc87ab0fee1ced412\": container with ID starting with a8cbce5bad82cc2005e6174464701d03852311d102b8ffabc87ab0fee1ced412 not found: ID does not exist" Apr 21 10:33:31.697329 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.697297 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh"] Apr 21 10:33:31.699015 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:31.698992 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-j64zh"] Apr 21 10:33:32.233075 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:32.233045 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d1c0253-c661-455d-a668-36464c700640" path="/var/lib/kubelet/pods/4d1c0253-c661-455d-a668-36464c700640/volumes" Apr 21 10:33:32.675926 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:32.675842 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" event={"ID":"5d383273-d751-42bf-b0f8-06633c766310","Type":"ContainerStarted","Data":"295e21de7179b8e2a142ded0e3b5029866007f5aba5380a57674affa1d063464"} Apr 21 10:33:32.676316 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:32.676141 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" Apr 21 10:33:32.677486 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:32.677460 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" podUID="5d383273-d751-42bf-b0f8-06633c766310" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 21 10:33:32.690003 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:32.689943 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" podStartSLOduration=5.689925554 podStartE2EDuration="5.689925554s" podCreationTimestamp="2026-04-21 10:33:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:33:32.689262972 +0000 UTC m=+1781.027796989" watchObservedRunningTime="2026-04-21 10:33:32.689925554 +0000 UTC m=+1781.028459575" Apr 21 10:33:33.679068 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:33.679029 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" podUID="5d383273-d751-42bf-b0f8-06633c766310" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 21 10:33:43.679610 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:43.679518 2577 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" podUID="5d383273-d751-42bf-b0f8-06633c766310" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 21 10:33:52.228468 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:52.228438 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 10:33:52.234352 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:52.234318 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 10:33:53.679944 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:33:53.679895 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" podUID="5d383273-d751-42bf-b0f8-06633c766310" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 21 10:34:03.679861 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:03.679814 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" podUID="5d383273-d751-42bf-b0f8-06633c766310" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 21 10:34:13.679443 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:13.679399 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" podUID="5d383273-d751-42bf-b0f8-06633c766310" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 21 10:34:23.679252 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:23.679204 2577 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" podUID="5d383273-d751-42bf-b0f8-06633c766310" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 21 10:34:33.680086 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:33.680040 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" podUID="5d383273-d751-42bf-b0f8-06633c766310" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 21 10:34:43.679912 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:43.679865 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" podUID="5d383273-d751-42bf-b0f8-06633c766310" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.39:8080: connect: connection refused" Apr 21 10:34:53.679970 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:53.679942 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" Apr 21 10:34:58.652962 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:58.652927 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4"] Apr 21 10:34:58.653450 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:58.653184 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" podUID="5d383273-d751-42bf-b0f8-06633c766310" containerName="kserve-container" containerID="cri-o://295e21de7179b8e2a142ded0e3b5029866007f5aba5380a57674affa1d063464" gracePeriod=30 Apr 21 10:34:58.743344 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:58.743305 2577 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t"] Apr 21 10:34:58.743616 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:58.743604 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d1c0253-c661-455d-a668-36464c700640" containerName="storage-initializer" Apr 21 10:34:58.743671 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:58.743617 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1c0253-c661-455d-a668-36464c700640" containerName="storage-initializer" Apr 21 10:34:58.743671 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:58.743649 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d1c0253-c661-455d-a668-36464c700640" containerName="kserve-container" Apr 21 10:34:58.743671 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:58.743657 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d1c0253-c661-455d-a668-36464c700640" containerName="kserve-container" Apr 21 10:34:58.743780 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:58.743707 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d1c0253-c661-455d-a668-36464c700640" containerName="kserve-container" Apr 21 10:34:58.746704 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:58.746686 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" Apr 21 10:34:58.762996 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:58.762964 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t"] Apr 21 10:34:58.872420 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:58.872388 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7fcbd1ef-e46a-4d07-9851-8e27c96e2744-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t\" (UID: \"7fcbd1ef-e46a-4d07-9851-8e27c96e2744\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" Apr 21 10:34:58.973483 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:58.973443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7fcbd1ef-e46a-4d07-9851-8e27c96e2744-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t\" (UID: \"7fcbd1ef-e46a-4d07-9851-8e27c96e2744\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" Apr 21 10:34:58.973859 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:58.973838 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7fcbd1ef-e46a-4d07-9851-8e27c96e2744-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t\" (UID: \"7fcbd1ef-e46a-4d07-9851-8e27c96e2744\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" Apr 21 10:34:59.056580 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:59.056549 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" Apr 21 10:34:59.177321 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:59.177285 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t"] Apr 21 10:34:59.180787 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:34:59.180737 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fcbd1ef_e46a_4d07_9851_8e27c96e2744.slice/crio-63be5c9ca49ea1359ccd31f78db33603a8d39effccbf7213e9cf64bb7de3603a WatchSource:0}: Error finding container 63be5c9ca49ea1359ccd31f78db33603a8d39effccbf7213e9cf64bb7de3603a: Status 404 returned error can't find the container with id 63be5c9ca49ea1359ccd31f78db33603a8d39effccbf7213e9cf64bb7de3603a Apr 21 10:34:59.939609 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:59.939566 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" event={"ID":"7fcbd1ef-e46a-4d07-9851-8e27c96e2744","Type":"ContainerStarted","Data":"676298af78708c706a023c9e2ffe4f6930e0cddce903ffefeadd44a579f8fde9"} Apr 21 10:34:59.939609 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:34:59.939611 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" event={"ID":"7fcbd1ef-e46a-4d07-9851-8e27c96e2744","Type":"ContainerStarted","Data":"63be5c9ca49ea1359ccd31f78db33603a8d39effccbf7213e9cf64bb7de3603a"} Apr 21 10:35:02.392068 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.392044 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" Apr 21 10:35:02.503701 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.503668 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d383273-d751-42bf-b0f8-06633c766310-kserve-provision-location\") pod \"5d383273-d751-42bf-b0f8-06633c766310\" (UID: \"5d383273-d751-42bf-b0f8-06633c766310\") " Apr 21 10:35:02.504049 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.504023 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d383273-d751-42bf-b0f8-06633c766310-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5d383273-d751-42bf-b0f8-06633c766310" (UID: "5d383273-d751-42bf-b0f8-06633c766310"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:35:02.604565 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.604530 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d383273-d751-42bf-b0f8-06633c766310-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:35:02.949022 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.948926 2577 generic.go:358] "Generic (PLEG): container finished" podID="5d383273-d751-42bf-b0f8-06633c766310" containerID="295e21de7179b8e2a142ded0e3b5029866007f5aba5380a57674affa1d063464" exitCode=0 Apr 21 10:35:02.949022 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.948992 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" Apr 21 10:35:02.949022 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.949008 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" event={"ID":"5d383273-d751-42bf-b0f8-06633c766310","Type":"ContainerDied","Data":"295e21de7179b8e2a142ded0e3b5029866007f5aba5380a57674affa1d063464"} Apr 21 10:35:02.949283 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.949040 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4" event={"ID":"5d383273-d751-42bf-b0f8-06633c766310","Type":"ContainerDied","Data":"fb1ab16b93f1e3f1ead0c1d61d0c47c4d207b4dea269d1904519d9358d0f1efc"} Apr 21 10:35:02.949283 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.949058 2577 scope.go:117] "RemoveContainer" containerID="295e21de7179b8e2a142ded0e3b5029866007f5aba5380a57674affa1d063464" Apr 21 10:35:02.950436 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.950413 2577 generic.go:358] "Generic (PLEG): container finished" podID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerID="676298af78708c706a023c9e2ffe4f6930e0cddce903ffefeadd44a579f8fde9" exitCode=0 Apr 21 10:35:02.950539 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.950447 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" event={"ID":"7fcbd1ef-e46a-4d07-9851-8e27c96e2744","Type":"ContainerDied","Data":"676298af78708c706a023c9e2ffe4f6930e0cddce903ffefeadd44a579f8fde9"} Apr 21 10:35:02.957507 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.957357 2577 scope.go:117] "RemoveContainer" containerID="73d7a5549fa475bef3f131e34c66cfd92129afb5ad44eb81542b2c3110a90d1b" Apr 21 10:35:02.965451 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.965425 2577 scope.go:117] "RemoveContainer" 
containerID="295e21de7179b8e2a142ded0e3b5029866007f5aba5380a57674affa1d063464" Apr 21 10:35:02.965793 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:35:02.965724 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"295e21de7179b8e2a142ded0e3b5029866007f5aba5380a57674affa1d063464\": container with ID starting with 295e21de7179b8e2a142ded0e3b5029866007f5aba5380a57674affa1d063464 not found: ID does not exist" containerID="295e21de7179b8e2a142ded0e3b5029866007f5aba5380a57674affa1d063464" Apr 21 10:35:02.965793 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.965770 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"295e21de7179b8e2a142ded0e3b5029866007f5aba5380a57674affa1d063464"} err="failed to get container status \"295e21de7179b8e2a142ded0e3b5029866007f5aba5380a57674affa1d063464\": rpc error: code = NotFound desc = could not find container \"295e21de7179b8e2a142ded0e3b5029866007f5aba5380a57674affa1d063464\": container with ID starting with 295e21de7179b8e2a142ded0e3b5029866007f5aba5380a57674affa1d063464 not found: ID does not exist" Apr 21 10:35:02.965793 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.965790 2577 scope.go:117] "RemoveContainer" containerID="73d7a5549fa475bef3f131e34c66cfd92129afb5ad44eb81542b2c3110a90d1b" Apr 21 10:35:02.966048 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:35:02.966026 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73d7a5549fa475bef3f131e34c66cfd92129afb5ad44eb81542b2c3110a90d1b\": container with ID starting with 73d7a5549fa475bef3f131e34c66cfd92129afb5ad44eb81542b2c3110a90d1b not found: ID does not exist" containerID="73d7a5549fa475bef3f131e34c66cfd92129afb5ad44eb81542b2c3110a90d1b" Apr 21 10:35:02.966109 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.966056 2577 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"73d7a5549fa475bef3f131e34c66cfd92129afb5ad44eb81542b2c3110a90d1b"} err="failed to get container status \"73d7a5549fa475bef3f131e34c66cfd92129afb5ad44eb81542b2c3110a90d1b\": rpc error: code = NotFound desc = could not find container \"73d7a5549fa475bef3f131e34c66cfd92129afb5ad44eb81542b2c3110a90d1b\": container with ID starting with 73d7a5549fa475bef3f131e34c66cfd92129afb5ad44eb81542b2c3110a90d1b not found: ID does not exist" Apr 21 10:35:02.981487 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.981458 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4"] Apr 21 10:35:02.984986 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:02.984964 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-dcjj4"] Apr 21 10:35:03.955510 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:03.955476 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" event={"ID":"7fcbd1ef-e46a-4d07-9851-8e27c96e2744","Type":"ContainerStarted","Data":"b513c9f103716d09404be88e0ae8229cf4f01104f0625f80e470e97a24ce91ec"} Apr 21 10:35:03.955928 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:03.955776 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" Apr 21 10:35:03.957056 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:03.957029 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 21 10:35:03.973487 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:03.973437 2577 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" podStartSLOduration=5.973421205 podStartE2EDuration="5.973421205s" podCreationTimestamp="2026-04-21 10:34:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:35:03.971926883 +0000 UTC m=+1872.310460900" watchObservedRunningTime="2026-04-21 10:35:03.973421205 +0000 UTC m=+1872.311955220" Apr 21 10:35:04.232245 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:04.232213 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d383273-d751-42bf-b0f8-06633c766310" path="/var/lib/kubelet/pods/5d383273-d751-42bf-b0f8-06633c766310/volumes" Apr 21 10:35:04.958768 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:04.958713 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 21 10:35:14.959013 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:14.958918 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 21 10:35:24.959663 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:24.959611 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 21 10:35:34.959142 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:34.959092 2577 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 21 10:35:44.959494 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:44.959444 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 21 10:35:54.959710 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:35:54.959662 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 21 10:36:04.958977 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:04.958930 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 21 10:36:07.228066 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:07.228027 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 21 10:36:17.228107 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:17.228047 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" 
podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.40:8080: connect: connection refused" Apr 21 10:36:27.228840 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:27.228802 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" Apr 21 10:36:29.775052 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:29.775021 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t"] Apr 21 10:36:29.775469 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:29.775254 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="kserve-container" containerID="cri-o://b513c9f103716d09404be88e0ae8229cf4f01104f0625f80e470e97a24ce91ec" gracePeriod=30 Apr 21 10:36:29.927362 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:29.927325 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg"] Apr 21 10:36:29.927624 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:29.927612 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d383273-d751-42bf-b0f8-06633c766310" containerName="storage-initializer" Apr 21 10:36:29.927672 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:29.927625 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d383273-d751-42bf-b0f8-06633c766310" containerName="storage-initializer" Apr 21 10:36:29.927672 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:29.927657 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d383273-d751-42bf-b0f8-06633c766310" containerName="kserve-container" Apr 21 10:36:29.927672 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:29.927666 
2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d383273-d751-42bf-b0f8-06633c766310" containerName="kserve-container" Apr 21 10:36:29.927814 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:29.927724 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d383273-d751-42bf-b0f8-06633c766310" containerName="kserve-container" Apr 21 10:36:29.930619 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:29.930600 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" Apr 21 10:36:29.940843 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:29.940808 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg"] Apr 21 10:36:30.009192 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:30.009154 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3-kserve-provision-location\") pod \"isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg\" (UID: \"9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3\") " pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" Apr 21 10:36:30.110064 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:30.109975 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3-kserve-provision-location\") pod \"isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg\" (UID: \"9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3\") " pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" Apr 21 10:36:30.110354 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:30.110335 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3-kserve-provision-location\") pod \"isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg\" (UID: \"9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3\") " pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" Apr 21 10:36:30.241309 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:30.241270 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" Apr 21 10:36:30.370052 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:30.369930 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg"] Apr 21 10:36:30.372786 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:36:30.372731 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b1aa616_4bfe_4fa8_85bc_a7fdc15697d3.slice/crio-3d9024695cd15dd21e4340dd2171b1b8c9ef50bf1555f0c48a425a23064f60dc WatchSource:0}: Error finding container 3d9024695cd15dd21e4340dd2171b1b8c9ef50bf1555f0c48a425a23064f60dc: Status 404 returned error can't find the container with id 3d9024695cd15dd21e4340dd2171b1b8c9ef50bf1555f0c48a425a23064f60dc Apr 21 10:36:31.216093 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:31.216060 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" event={"ID":"9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3","Type":"ContainerStarted","Data":"7623ea2cb5e211c9d95395fde2a5e11d57ac97b5329a1e81a8e9c5f606fe3e69"} Apr 21 10:36:31.216093 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:31.216095 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" event={"ID":"9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3","Type":"ContainerStarted","Data":"3d9024695cd15dd21e4340dd2171b1b8c9ef50bf1555f0c48a425a23064f60dc"} Apr 21 
10:36:33.719437 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:33.719406 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" Apr 21 10:36:33.741526 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:33.741496 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7fcbd1ef-e46a-4d07-9851-8e27c96e2744-kserve-provision-location\") pod \"7fcbd1ef-e46a-4d07-9851-8e27c96e2744\" (UID: \"7fcbd1ef-e46a-4d07-9851-8e27c96e2744\") " Apr 21 10:36:33.741860 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:33.741835 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fcbd1ef-e46a-4d07-9851-8e27c96e2744-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7fcbd1ef-e46a-4d07-9851-8e27c96e2744" (UID: "7fcbd1ef-e46a-4d07-9851-8e27c96e2744"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:36:33.843003 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:33.842914 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7fcbd1ef-e46a-4d07-9851-8e27c96e2744-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:36:34.225831 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:34.225791 2577 generic.go:358] "Generic (PLEG): container finished" podID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerID="b513c9f103716d09404be88e0ae8229cf4f01104f0625f80e470e97a24ce91ec" exitCode=0 Apr 21 10:36:34.226079 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:34.225859 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" Apr 21 10:36:34.226079 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:34.225880 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" event={"ID":"7fcbd1ef-e46a-4d07-9851-8e27c96e2744","Type":"ContainerDied","Data":"b513c9f103716d09404be88e0ae8229cf4f01104f0625f80e470e97a24ce91ec"} Apr 21 10:36:34.226079 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:34.225932 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t" event={"ID":"7fcbd1ef-e46a-4d07-9851-8e27c96e2744","Type":"ContainerDied","Data":"63be5c9ca49ea1359ccd31f78db33603a8d39effccbf7213e9cf64bb7de3603a"} Apr 21 10:36:34.226079 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:34.225949 2577 scope.go:117] "RemoveContainer" containerID="b513c9f103716d09404be88e0ae8229cf4f01104f0625f80e470e97a24ce91ec" Apr 21 10:36:34.227320 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:34.227280 2577 generic.go:358] "Generic (PLEG): container finished" podID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerID="7623ea2cb5e211c9d95395fde2a5e11d57ac97b5329a1e81a8e9c5f606fe3e69" exitCode=0 Apr 21 10:36:34.231657 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:34.231626 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" event={"ID":"9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3","Type":"ContainerDied","Data":"7623ea2cb5e211c9d95395fde2a5e11d57ac97b5329a1e81a8e9c5f606fe3e69"} Apr 21 10:36:34.234049 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:34.234026 2577 scope.go:117] "RemoveContainer" containerID="676298af78708c706a023c9e2ffe4f6930e0cddce903ffefeadd44a579f8fde9" Apr 21 10:36:34.241654 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:34.241624 2577 scope.go:117] "RemoveContainer" 
containerID="b513c9f103716d09404be88e0ae8229cf4f01104f0625f80e470e97a24ce91ec" Apr 21 10:36:34.241955 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:36:34.241927 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b513c9f103716d09404be88e0ae8229cf4f01104f0625f80e470e97a24ce91ec\": container with ID starting with b513c9f103716d09404be88e0ae8229cf4f01104f0625f80e470e97a24ce91ec not found: ID does not exist" containerID="b513c9f103716d09404be88e0ae8229cf4f01104f0625f80e470e97a24ce91ec" Apr 21 10:36:34.242003 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:34.241967 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b513c9f103716d09404be88e0ae8229cf4f01104f0625f80e470e97a24ce91ec"} err="failed to get container status \"b513c9f103716d09404be88e0ae8229cf4f01104f0625f80e470e97a24ce91ec\": rpc error: code = NotFound desc = could not find container \"b513c9f103716d09404be88e0ae8229cf4f01104f0625f80e470e97a24ce91ec\": container with ID starting with b513c9f103716d09404be88e0ae8229cf4f01104f0625f80e470e97a24ce91ec not found: ID does not exist" Apr 21 10:36:34.242003 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:34.241995 2577 scope.go:117] "RemoveContainer" containerID="676298af78708c706a023c9e2ffe4f6930e0cddce903ffefeadd44a579f8fde9" Apr 21 10:36:34.242249 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:36:34.242230 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"676298af78708c706a023c9e2ffe4f6930e0cddce903ffefeadd44a579f8fde9\": container with ID starting with 676298af78708c706a023c9e2ffe4f6930e0cddce903ffefeadd44a579f8fde9 not found: ID does not exist" containerID="676298af78708c706a023c9e2ffe4f6930e0cddce903ffefeadd44a579f8fde9" Apr 21 10:36:34.242316 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:34.242258 2577 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"676298af78708c706a023c9e2ffe4f6930e0cddce903ffefeadd44a579f8fde9"} err="failed to get container status \"676298af78708c706a023c9e2ffe4f6930e0cddce903ffefeadd44a579f8fde9\": rpc error: code = NotFound desc = could not find container \"676298af78708c706a023c9e2ffe4f6930e0cddce903ffefeadd44a579f8fde9\": container with ID starting with 676298af78708c706a023c9e2ffe4f6930e0cddce903ffefeadd44a579f8fde9 not found: ID does not exist" Apr 21 10:36:34.264287 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:34.264250 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t"] Apr 21 10:36:34.270081 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:34.270049 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-lkh6t"] Apr 21 10:36:35.232906 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:35.232875 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" event={"ID":"9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3","Type":"ContainerStarted","Data":"0e762e709061bc006b114d836606c467d89f14cef251b59d2ce74944029634ce"} Apr 21 10:36:35.233410 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:35.233207 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" Apr 21 10:36:35.234540 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:35.234507 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" podUID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 21 10:36:35.255284 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:35.255236 2577 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" podStartSLOduration=6.255220292 podStartE2EDuration="6.255220292s" podCreationTimestamp="2026-04-21 10:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:36:35.252345292 +0000 UTC m=+1963.590879309" watchObservedRunningTime="2026-04-21 10:36:35.255220292 +0000 UTC m=+1963.593754308" Apr 21 10:36:36.231618 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:36.231580 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" path="/var/lib/kubelet/pods/7fcbd1ef-e46a-4d07-9851-8e27c96e2744/volumes" Apr 21 10:36:36.235510 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:36.235477 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" podUID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 21 10:36:46.235705 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:46.235618 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" podUID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 21 10:36:56.235698 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:36:56.235648 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" podUID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 21 10:37:06.235870 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:06.235825 2577 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" podUID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 21 10:37:16.235448 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:16.235406 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" podUID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 21 10:37:26.235561 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:26.235516 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" podUID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 21 10:37:36.236391 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:36.236342 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" podUID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.41:8080: connect: connection refused" Apr 21 10:37:45.229532 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:45.229500 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" Apr 21 10:37:49.981758 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:49.981718 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6"] Apr 21 10:37:49.982229 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:49.982044 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="kserve-container" Apr 21 10:37:49.982229 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:49.982055 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="kserve-container" Apr 21 10:37:49.982229 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:49.982064 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="storage-initializer" Apr 21 10:37:49.982229 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:49.982070 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="storage-initializer" Apr 21 10:37:49.982229 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:49.982116 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="7fcbd1ef-e46a-4d07-9851-8e27c96e2744" containerName="kserve-container" Apr 21 10:37:49.986973 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:49.985634 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" Apr 21 10:37:49.988803 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:49.988769 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-6cd07a\"" Apr 21 10:37:49.988803 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:49.988799 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-6cd07a-dockercfg-8wzj4\"" Apr 21 10:37:49.989288 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:49.989275 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 21 10:37:49.995237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:49.995213 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6"] Apr 21 10:37:50.055972 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:50.055929 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24580193-9d49-483e-8f0e-de94069bbc90-kserve-provision-location\") pod \"isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6\" (UID: \"24580193-9d49-483e-8f0e-de94069bbc90\") " pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" Apr 21 10:37:50.055972 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:50.055970 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/24580193-9d49-483e-8f0e-de94069bbc90-cabundle-cert\") pod \"isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6\" (UID: \"24580193-9d49-483e-8f0e-de94069bbc90\") " pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" Apr 21 10:37:50.156537 ip-10-0-133-157 kubenswrapper[2577]: 
I0421 10:37:50.156503 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24580193-9d49-483e-8f0e-de94069bbc90-kserve-provision-location\") pod \"isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6\" (UID: \"24580193-9d49-483e-8f0e-de94069bbc90\") " pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" Apr 21 10:37:50.156537 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:50.156540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/24580193-9d49-483e-8f0e-de94069bbc90-cabundle-cert\") pod \"isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6\" (UID: \"24580193-9d49-483e-8f0e-de94069bbc90\") " pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" Apr 21 10:37:50.156938 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:50.156918 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24580193-9d49-483e-8f0e-de94069bbc90-kserve-provision-location\") pod \"isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6\" (UID: \"24580193-9d49-483e-8f0e-de94069bbc90\") " pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" Apr 21 10:37:50.157174 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:50.157156 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/24580193-9d49-483e-8f0e-de94069bbc90-cabundle-cert\") pod \"isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6\" (UID: \"24580193-9d49-483e-8f0e-de94069bbc90\") " pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" Apr 21 10:37:50.298094 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:50.297975 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" Apr 21 10:37:50.420346 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:50.420312 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6"] Apr 21 10:37:50.423501 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:37:50.423472 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24580193_9d49_483e_8f0e_de94069bbc90.slice/crio-edb53f0239df90f58b5d883e15d10d2bd66f955c2828ad52c0eca951c8b9389c WatchSource:0}: Error finding container edb53f0239df90f58b5d883e15d10d2bd66f955c2828ad52c0eca951c8b9389c: Status 404 returned error can't find the container with id edb53f0239df90f58b5d883e15d10d2bd66f955c2828ad52c0eca951c8b9389c Apr 21 10:37:50.425731 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:50.425713 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:37:50.452169 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:50.452142 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" event={"ID":"24580193-9d49-483e-8f0e-de94069bbc90","Type":"ContainerStarted","Data":"edb53f0239df90f58b5d883e15d10d2bd66f955c2828ad52c0eca951c8b9389c"} Apr 21 10:37:51.456351 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:51.456315 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" event={"ID":"24580193-9d49-483e-8f0e-de94069bbc90","Type":"ContainerStarted","Data":"d2124bf70b7afb240229cb964f5776094dbe8d5a550518a57c58dc392de255ff"} Apr 21 10:37:56.472577 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:56.472548 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6_24580193-9d49-483e-8f0e-de94069bbc90/storage-initializer/0.log" Apr 21 10:37:56.472991 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:56.472587 2577 generic.go:358] "Generic (PLEG): container finished" podID="24580193-9d49-483e-8f0e-de94069bbc90" containerID="d2124bf70b7afb240229cb964f5776094dbe8d5a550518a57c58dc392de255ff" exitCode=1 Apr 21 10:37:56.472991 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:56.472620 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" event={"ID":"24580193-9d49-483e-8f0e-de94069bbc90","Type":"ContainerDied","Data":"d2124bf70b7afb240229cb964f5776094dbe8d5a550518a57c58dc392de255ff"} Apr 21 10:37:57.477099 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:57.477071 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6_24580193-9d49-483e-8f0e-de94069bbc90/storage-initializer/0.log" Apr 21 10:37:57.477541 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:37:57.477151 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" event={"ID":"24580193-9d49-483e-8f0e-de94069bbc90","Type":"ContainerStarted","Data":"7eaf5c9e2a60d5157b6bbfb71ad37ddd8d27af79b9dd47eac33c7d7ec90cd5ed"} Apr 21 10:38:00.486572 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:00.486547 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6_24580193-9d49-483e-8f0e-de94069bbc90/storage-initializer/1.log" Apr 21 10:38:00.486995 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:00.486884 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6_24580193-9d49-483e-8f0e-de94069bbc90/storage-initializer/0.log" Apr 21 
10:38:00.486995 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:00.486915 2577 generic.go:358] "Generic (PLEG): container finished" podID="24580193-9d49-483e-8f0e-de94069bbc90" containerID="7eaf5c9e2a60d5157b6bbfb71ad37ddd8d27af79b9dd47eac33c7d7ec90cd5ed" exitCode=1 Apr 21 10:38:00.486995 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:00.486953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" event={"ID":"24580193-9d49-483e-8f0e-de94069bbc90","Type":"ContainerDied","Data":"7eaf5c9e2a60d5157b6bbfb71ad37ddd8d27af79b9dd47eac33c7d7ec90cd5ed"} Apr 21 10:38:00.486995 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:00.486985 2577 scope.go:117] "RemoveContainer" containerID="d2124bf70b7afb240229cb964f5776094dbe8d5a550518a57c58dc392de255ff" Apr 21 10:38:00.487329 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:00.487304 2577 scope.go:117] "RemoveContainer" containerID="d2124bf70b7afb240229cb964f5776094dbe8d5a550518a57c58dc392de255ff" Apr 21 10:38:00.497330 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:38:00.497294 2577 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6_kserve-ci-e2e-test_24580193-9d49-483e-8f0e-de94069bbc90_0 in pod sandbox edb53f0239df90f58b5d883e15d10d2bd66f955c2828ad52c0eca951c8b9389c from index: no such id: 'd2124bf70b7afb240229cb964f5776094dbe8d5a550518a57c58dc392de255ff'" containerID="d2124bf70b7afb240229cb964f5776094dbe8d5a550518a57c58dc392de255ff" Apr 21 10:38:00.497398 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:38:00.497352 2577 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container 
k8s_storage-initializer_isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6_kserve-ci-e2e-test_24580193-9d49-483e-8f0e-de94069bbc90_0 in pod sandbox edb53f0239df90f58b5d883e15d10d2bd66f955c2828ad52c0eca951c8b9389c from index: no such id: 'd2124bf70b7afb240229cb964f5776094dbe8d5a550518a57c58dc392de255ff'; Skipping pod \"isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6_kserve-ci-e2e-test(24580193-9d49-483e-8f0e-de94069bbc90)\"" logger="UnhandledError" Apr 21 10:38:00.498700 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:38:00.498680 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6_kserve-ci-e2e-test(24580193-9d49-483e-8f0e-de94069bbc90)\"" pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" podUID="24580193-9d49-483e-8f0e-de94069bbc90" Apr 21 10:38:01.491237 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:01.491209 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6_24580193-9d49-483e-8f0e-de94069bbc90/storage-initializer/1.log" Apr 21 10:38:08.071804 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.071740 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6"] Apr 21 10:38:08.113929 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.113895 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg"] Apr 21 10:38:08.114176 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.114155 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" podUID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerName="kserve-container" 
containerID="cri-o://0e762e709061bc006b114d836606c467d89f14cef251b59d2ce74944029634ce" gracePeriod=30 Apr 21 10:38:08.190952 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.190928 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr"] Apr 21 10:38:08.195847 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.195826 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" Apr 21 10:38:08.198353 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.198334 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-4b8009-dockercfg-zl49c\"" Apr 21 10:38:08.198433 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.198337 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-4b8009\"" Apr 21 10:38:08.202654 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.202631 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr"] Apr 21 10:38:08.206132 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.206114 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6_24580193-9d49-483e-8f0e-de94069bbc90/storage-initializer/1.log" Apr 21 10:38:08.206225 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.206168 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" Apr 21 10:38:08.303338 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.303296 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24580193-9d49-483e-8f0e-de94069bbc90-kserve-provision-location\") pod \"24580193-9d49-483e-8f0e-de94069bbc90\" (UID: \"24580193-9d49-483e-8f0e-de94069bbc90\") " Apr 21 10:38:08.303517 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.303377 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/24580193-9d49-483e-8f0e-de94069bbc90-cabundle-cert\") pod \"24580193-9d49-483e-8f0e-de94069bbc90\" (UID: \"24580193-9d49-483e-8f0e-de94069bbc90\") " Apr 21 10:38:08.303619 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.303597 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e06a947-b21b-4d88-b8a3-efa18b3e11a8-kserve-provision-location\") pod \"isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr\" (UID: \"9e06a947-b21b-4d88-b8a3-efa18b3e11a8\") " pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" Apr 21 10:38:08.303679 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.303625 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24580193-9d49-483e-8f0e-de94069bbc90-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "24580193-9d49-483e-8f0e-de94069bbc90" (UID: "24580193-9d49-483e-8f0e-de94069bbc90"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:38:08.303679 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.303642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9e06a947-b21b-4d88-b8a3-efa18b3e11a8-cabundle-cert\") pod \"isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr\" (UID: \"9e06a947-b21b-4d88-b8a3-efa18b3e11a8\") " pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" Apr 21 10:38:08.303777 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.303720 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/24580193-9d49-483e-8f0e-de94069bbc90-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:38:08.303777 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.303722 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24580193-9d49-483e-8f0e-de94069bbc90-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "24580193-9d49-483e-8f0e-de94069bbc90" (UID: "24580193-9d49-483e-8f0e-de94069bbc90"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:38:08.405119 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.405015 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e06a947-b21b-4d88-b8a3-efa18b3e11a8-kserve-provision-location\") pod \"isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr\" (UID: \"9e06a947-b21b-4d88-b8a3-efa18b3e11a8\") " pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" Apr 21 10:38:08.405119 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.405073 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9e06a947-b21b-4d88-b8a3-efa18b3e11a8-cabundle-cert\") pod \"isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr\" (UID: \"9e06a947-b21b-4d88-b8a3-efa18b3e11a8\") " pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" Apr 21 10:38:08.405119 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.405107 2577 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/24580193-9d49-483e-8f0e-de94069bbc90-cabundle-cert\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:38:08.405438 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.405416 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e06a947-b21b-4d88-b8a3-efa18b3e11a8-kserve-provision-location\") pod \"isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr\" (UID: \"9e06a947-b21b-4d88-b8a3-efa18b3e11a8\") " pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" Apr 21 10:38:08.405673 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.405657 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/9e06a947-b21b-4d88-b8a3-efa18b3e11a8-cabundle-cert\") pod \"isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr\" (UID: \"9e06a947-b21b-4d88-b8a3-efa18b3e11a8\") " pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" Apr 21 10:38:08.511503 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.511476 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6_24580193-9d49-483e-8f0e-de94069bbc90/storage-initializer/1.log" Apr 21 10:38:08.511681 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.511534 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" event={"ID":"24580193-9d49-483e-8f0e-de94069bbc90","Type":"ContainerDied","Data":"edb53f0239df90f58b5d883e15d10d2bd66f955c2828ad52c0eca951c8b9389c"} Apr 21 10:38:08.511681 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.511581 2577 scope.go:117] "RemoveContainer" containerID="7eaf5c9e2a60d5157b6bbfb71ad37ddd8d27af79b9dd47eac33c7d7ec90cd5ed" Apr 21 10:38:08.511681 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.511628 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6" Apr 21 10:38:08.517559 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.517528 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" Apr 21 10:38:08.549191 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.549160 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6"] Apr 21 10:38:08.554624 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.554596 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-6cd07a-predictor-7576f8c97d-zgfh6"] Apr 21 10:38:08.642243 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:08.642210 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr"] Apr 21 10:38:08.645512 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:38:08.645474 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e06a947_b21b_4d88_b8a3_efa18b3e11a8.slice/crio-801fe80798897a1368f5dc4b24a8ca3083e005c6dbb12583d37ee03274416829 WatchSource:0}: Error finding container 801fe80798897a1368f5dc4b24a8ca3083e005c6dbb12583d37ee03274416829: Status 404 returned error can't find the container with id 801fe80798897a1368f5dc4b24a8ca3083e005c6dbb12583d37ee03274416829 Apr 21 10:38:09.518892 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:09.518857 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" event={"ID":"9e06a947-b21b-4d88-b8a3-efa18b3e11a8","Type":"ContainerStarted","Data":"f0fef29a371a7955e13699df06d7af0e46b00197f184c9afa8ab42366ec49692"} Apr 21 10:38:09.518892 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:09.518889 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" 
event={"ID":"9e06a947-b21b-4d88-b8a3-efa18b3e11a8","Type":"ContainerStarted","Data":"801fe80798897a1368f5dc4b24a8ca3083e005c6dbb12583d37ee03274416829"} Apr 21 10:38:10.231869 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:10.231824 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24580193-9d49-483e-8f0e-de94069bbc90" path="/var/lib/kubelet/pods/24580193-9d49-483e-8f0e-de94069bbc90/volumes" Apr 21 10:38:12.528565 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:12.528537 2577 generic.go:358] "Generic (PLEG): container finished" podID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerID="0e762e709061bc006b114d836606c467d89f14cef251b59d2ce74944029634ce" exitCode=0 Apr 21 10:38:12.528945 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:12.528617 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" event={"ID":"9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3","Type":"ContainerDied","Data":"0e762e709061bc006b114d836606c467d89f14cef251b59d2ce74944029634ce"} Apr 21 10:38:12.651062 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:12.651038 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" Apr 21 10:38:12.745968 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:12.745936 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3-kserve-provision-location\") pod \"9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3\" (UID: \"9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3\") " Apr 21 10:38:12.746329 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:12.746303 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" (UID: "9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:38:12.847150 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:12.847056 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:38:13.532433 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:13.532405 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr_9e06a947-b21b-4d88-b8a3-efa18b3e11a8/storage-initializer/0.log" Apr 21 10:38:13.532939 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:13.532443 2577 generic.go:358] "Generic (PLEG): container finished" podID="9e06a947-b21b-4d88-b8a3-efa18b3e11a8" containerID="f0fef29a371a7955e13699df06d7af0e46b00197f184c9afa8ab42366ec49692" exitCode=1 Apr 21 10:38:13.532939 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:13.532529 2577 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" event={"ID":"9e06a947-b21b-4d88-b8a3-efa18b3e11a8","Type":"ContainerDied","Data":"f0fef29a371a7955e13699df06d7af0e46b00197f184c9afa8ab42366ec49692"} Apr 21 10:38:13.534054 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:13.534031 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" event={"ID":"9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3","Type":"ContainerDied","Data":"3d9024695cd15dd21e4340dd2171b1b8c9ef50bf1555f0c48a425a23064f60dc"} Apr 21 10:38:13.534184 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:13.534063 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg" Apr 21 10:38:13.534184 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:13.534069 2577 scope.go:117] "RemoveContainer" containerID="0e762e709061bc006b114d836606c467d89f14cef251b59d2ce74944029634ce" Apr 21 10:38:13.543354 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:13.543331 2577 scope.go:117] "RemoveContainer" containerID="7623ea2cb5e211c9d95395fde2a5e11d57ac97b5329a1e81a8e9c5f606fe3e69" Apr 21 10:38:13.566242 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:13.566213 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg"] Apr 21 10:38:13.569424 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:13.569393 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-6cd07a-predictor-859f97c7ff-dpxdg"] Apr 21 10:38:14.231837 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:14.231805 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" path="/var/lib/kubelet/pods/9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3/volumes" Apr 21 10:38:14.539358 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:14.539282 
2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr_9e06a947-b21b-4d88-b8a3-efa18b3e11a8/storage-initializer/0.log" Apr 21 10:38:14.539715 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:14.539355 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" event={"ID":"9e06a947-b21b-4d88-b8a3-efa18b3e11a8","Type":"ContainerStarted","Data":"adfed68b53befda69f53b1ed68cdd288bc5a9a471bae7cca0beea4f89289a36a"} Apr 21 10:38:16.546570 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:16.546540 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr_9e06a947-b21b-4d88-b8a3-efa18b3e11a8/storage-initializer/1.log" Apr 21 10:38:16.546982 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:16.546903 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr_9e06a947-b21b-4d88-b8a3-efa18b3e11a8/storage-initializer/0.log" Apr 21 10:38:16.546982 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:16.546933 2577 generic.go:358] "Generic (PLEG): container finished" podID="9e06a947-b21b-4d88-b8a3-efa18b3e11a8" containerID="adfed68b53befda69f53b1ed68cdd288bc5a9a471bae7cca0beea4f89289a36a" exitCode=1 Apr 21 10:38:16.547081 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:16.546982 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" event={"ID":"9e06a947-b21b-4d88-b8a3-efa18b3e11a8","Type":"ContainerDied","Data":"adfed68b53befda69f53b1ed68cdd288bc5a9a471bae7cca0beea4f89289a36a"} Apr 21 10:38:16.547081 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:16.547013 2577 scope.go:117] "RemoveContainer" containerID="f0fef29a371a7955e13699df06d7af0e46b00197f184c9afa8ab42366ec49692" Apr 21 10:38:16.547398 ip-10-0-133-157 
kubenswrapper[2577]: I0421 10:38:16.547381 2577 scope.go:117] "RemoveContainer" containerID="f0fef29a371a7955e13699df06d7af0e46b00197f184c9afa8ab42366ec49692" Apr 21 10:38:16.557334 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:38:16.557299 2577 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr_kserve-ci-e2e-test_9e06a947-b21b-4d88-b8a3-efa18b3e11a8_0 in pod sandbox 801fe80798897a1368f5dc4b24a8ca3083e005c6dbb12583d37ee03274416829 from index: no such id: 'f0fef29a371a7955e13699df06d7af0e46b00197f184c9afa8ab42366ec49692'" containerID="f0fef29a371a7955e13699df06d7af0e46b00197f184c9afa8ab42366ec49692" Apr 21 10:38:16.557392 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:38:16.557359 2577 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr_kserve-ci-e2e-test_9e06a947-b21b-4d88-b8a3-efa18b3e11a8_0 in pod sandbox 801fe80798897a1368f5dc4b24a8ca3083e005c6dbb12583d37ee03274416829 from index: no such id: 'f0fef29a371a7955e13699df06d7af0e46b00197f184c9afa8ab42366ec49692'; Skipping pod \"isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr_kserve-ci-e2e-test(9e06a947-b21b-4d88-b8a3-efa18b3e11a8)\"" logger="UnhandledError" Apr 21 10:38:16.558682 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:38:16.558663 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr_kserve-ci-e2e-test(9e06a947-b21b-4d88-b8a3-efa18b3e11a8)\"" pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" podUID="9e06a947-b21b-4d88-b8a3-efa18b3e11a8" Apr 21 
10:38:17.551023 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:17.550992 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr_9e06a947-b21b-4d88-b8a3-efa18b3e11a8/storage-initializer/1.log" Apr 21 10:38:18.326070 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.326039 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr"] Apr 21 10:38:18.403339 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.403309 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4"] Apr 21 10:38:18.403608 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.403596 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerName="kserve-container" Apr 21 10:38:18.403657 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.403610 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerName="kserve-container" Apr 21 10:38:18.403657 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.403623 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerName="storage-initializer" Apr 21 10:38:18.403657 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.403629 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerName="storage-initializer" Apr 21 10:38:18.403657 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.403639 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24580193-9d49-483e-8f0e-de94069bbc90" containerName="storage-initializer" Apr 21 10:38:18.403657 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.403644 2577 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="24580193-9d49-483e-8f0e-de94069bbc90" containerName="storage-initializer" Apr 21 10:38:18.403657 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.403658 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="24580193-9d49-483e-8f0e-de94069bbc90" containerName="storage-initializer" Apr 21 10:38:18.403857 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.403663 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="24580193-9d49-483e-8f0e-de94069bbc90" containerName="storage-initializer" Apr 21 10:38:18.403857 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.403707 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="24580193-9d49-483e-8f0e-de94069bbc90" containerName="storage-initializer" Apr 21 10:38:18.403857 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.403716 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b1aa616-4bfe-4fa8-85bc-a7fdc15697d3" containerName="kserve-container" Apr 21 10:38:18.403857 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.403835 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="24580193-9d49-483e-8f0e-de94069bbc90" containerName="storage-initializer" Apr 21 10:38:18.407823 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.407804 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" Apr 21 10:38:18.414199 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.414173 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-pjlcm\"" Apr 21 10:38:18.424887 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.424856 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4"] Apr 21 10:38:18.459764 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.459727 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr_9e06a947-b21b-4d88-b8a3-efa18b3e11a8/storage-initializer/1.log" Apr 21 10:38:18.459913 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.459811 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" Apr 21 10:38:18.496622 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.496583 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12d664fb-d722-4fe9-aafb-2f31294ae810-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-hj2m4\" (UID: \"12d664fb-d722-4fe9-aafb-2f31294ae810\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" Apr 21 10:38:18.555999 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.555972 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr_9e06a947-b21b-4d88-b8a3-efa18b3e11a8/storage-initializer/1.log" Apr 21 10:38:18.556384 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.556021 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" event={"ID":"9e06a947-b21b-4d88-b8a3-efa18b3e11a8","Type":"ContainerDied","Data":"801fe80798897a1368f5dc4b24a8ca3083e005c6dbb12583d37ee03274416829"} Apr 21 10:38:18.556384 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.556063 2577 scope.go:117] "RemoveContainer" containerID="adfed68b53befda69f53b1ed68cdd288bc5a9a471bae7cca0beea4f89289a36a" Apr 21 10:38:18.556384 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.556090 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr" Apr 21 10:38:18.597561 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.597481 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e06a947-b21b-4d88-b8a3-efa18b3e11a8-kserve-provision-location\") pod \"9e06a947-b21b-4d88-b8a3-efa18b3e11a8\" (UID: \"9e06a947-b21b-4d88-b8a3-efa18b3e11a8\") " Apr 21 10:38:18.597561 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.597543 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9e06a947-b21b-4d88-b8a3-efa18b3e11a8-cabundle-cert\") pod \"9e06a947-b21b-4d88-b8a3-efa18b3e11a8\" (UID: \"9e06a947-b21b-4d88-b8a3-efa18b3e11a8\") " Apr 21 10:38:18.597727 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.597659 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12d664fb-d722-4fe9-aafb-2f31294ae810-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-hj2m4\" (UID: \"12d664fb-d722-4fe9-aafb-2f31294ae810\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" Apr 21 10:38:18.597815 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.597791 2577 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e06a947-b21b-4d88-b8a3-efa18b3e11a8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9e06a947-b21b-4d88-b8a3-efa18b3e11a8" (UID: "9e06a947-b21b-4d88-b8a3-efa18b3e11a8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:38:18.597902 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.597882 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e06a947-b21b-4d88-b8a3-efa18b3e11a8-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "9e06a947-b21b-4d88-b8a3-efa18b3e11a8" (UID: "9e06a947-b21b-4d88-b8a3-efa18b3e11a8"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 21 10:38:18.597988 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.597975 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12d664fb-d722-4fe9-aafb-2f31294ae810-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-hj2m4\" (UID: \"12d664fb-d722-4fe9-aafb-2f31294ae810\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" Apr 21 10:38:18.698957 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.698914 2577 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9e06a947-b21b-4d88-b8a3-efa18b3e11a8-cabundle-cert\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:38:18.698957 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.698951 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9e06a947-b21b-4d88-b8a3-efa18b3e11a8-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 
10:38:18.718885 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.718860 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" Apr 21 10:38:18.847833 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.847779 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4"] Apr 21 10:38:18.850457 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:38:18.850428 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12d664fb_d722_4fe9_aafb_2f31294ae810.slice/crio-3a298bf6ff8edfbc252e34620b241e5fc7684790c6ef4ef458e5b2210c9f31cd WatchSource:0}: Error finding container 3a298bf6ff8edfbc252e34620b241e5fc7684790c6ef4ef458e5b2210c9f31cd: Status 404 returned error can't find the container with id 3a298bf6ff8edfbc252e34620b241e5fc7684790c6ef4ef458e5b2210c9f31cd Apr 21 10:38:18.891027 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.890996 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr"] Apr 21 10:38:18.894675 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:18.894647 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-4b8009-predictor-6789dddf56-ql7gr"] Apr 21 10:38:19.561426 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:19.561376 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" event={"ID":"12d664fb-d722-4fe9-aafb-2f31294ae810","Type":"ContainerStarted","Data":"30cf3d6e762bf69163eaa5c47f1a6e9b78d8da46ec4ce4e0b062bc0d24ff0342"} Apr 21 10:38:19.561426 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:19.561423 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" event={"ID":"12d664fb-d722-4fe9-aafb-2f31294ae810","Type":"ContainerStarted","Data":"3a298bf6ff8edfbc252e34620b241e5fc7684790c6ef4ef458e5b2210c9f31cd"} Apr 21 10:38:20.231413 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:20.231371 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e06a947-b21b-4d88-b8a3-efa18b3e11a8" path="/var/lib/kubelet/pods/9e06a947-b21b-4d88-b8a3-efa18b3e11a8/volumes" Apr 21 10:38:22.570524 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:22.570484 2577 generic.go:358] "Generic (PLEG): container finished" podID="12d664fb-d722-4fe9-aafb-2f31294ae810" containerID="30cf3d6e762bf69163eaa5c47f1a6e9b78d8da46ec4ce4e0b062bc0d24ff0342" exitCode=0 Apr 21 10:38:22.571050 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:22.570563 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" event={"ID":"12d664fb-d722-4fe9-aafb-2f31294ae810","Type":"ContainerDied","Data":"30cf3d6e762bf69163eaa5c47f1a6e9b78d8da46ec4ce4e0b062bc0d24ff0342"} Apr 21 10:38:44.640642 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:44.640604 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" event={"ID":"12d664fb-d722-4fe9-aafb-2f31294ae810","Type":"ContainerStarted","Data":"94e21dcd8873786512937cb493930e7b3d3f4c4ff97e5d8d24c69c37447d6482"} Apr 21 10:38:44.641148 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:44.640899 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" Apr 21 10:38:44.642259 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:44.642233 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 21 10:38:44.657791 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:44.657723 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" podStartSLOduration=5.618958364 podStartE2EDuration="26.65770714s" podCreationTimestamp="2026-04-21 10:38:18 +0000 UTC" firstStartedPulling="2026-04-21 10:38:22.57181722 +0000 UTC m=+2070.910351219" lastFinishedPulling="2026-04-21 10:38:43.610565995 +0000 UTC m=+2091.949099995" observedRunningTime="2026-04-21 10:38:44.656707837 +0000 UTC m=+2092.995241853" watchObservedRunningTime="2026-04-21 10:38:44.65770714 +0000 UTC m=+2092.996241157" Apr 21 10:38:45.644664 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:45.644627 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 21 10:38:52.258124 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:52.258096 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 10:38:52.261163 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:52.261138 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 10:38:55.645644 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:38:55.645537 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 21 10:39:05.645327 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:39:05.645285 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 21 10:39:15.645077 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:39:15.645025 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 21 10:39:25.645046 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:39:25.644996 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 21 10:39:35.644694 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:39:35.644650 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 21 10:39:45.645179 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:39:45.645081 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: 
connection refused" Apr 21 10:39:55.645052 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:39:55.645001 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.44:8080: connect: connection refused" Apr 21 10:39:58.238759 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:39:58.238712 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" Apr 21 10:40:08.532014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:08.531923 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4"] Apr 21 10:40:08.532461 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:08.532282 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" containerName="kserve-container" containerID="cri-o://94e21dcd8873786512937cb493930e7b3d3f4c4ff97e5d8d24c69c37447d6482" gracePeriod=30 Apr 21 10:40:08.618444 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:08.618407 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h"] Apr 21 10:40:08.618784 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:08.618760 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e06a947-b21b-4d88-b8a3-efa18b3e11a8" containerName="storage-initializer" Apr 21 10:40:08.618784 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:08.618779 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e06a947-b21b-4d88-b8a3-efa18b3e11a8" containerName="storage-initializer" Apr 21 10:40:08.618784 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:40:08.618791 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9e06a947-b21b-4d88-b8a3-efa18b3e11a8" containerName="storage-initializer" Apr 21 10:40:08.618971 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:08.618797 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e06a947-b21b-4d88-b8a3-efa18b3e11a8" containerName="storage-initializer" Apr 21 10:40:08.618971 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:08.618858 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e06a947-b21b-4d88-b8a3-efa18b3e11a8" containerName="storage-initializer" Apr 21 10:40:08.618971 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:08.618867 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9e06a947-b21b-4d88-b8a3-efa18b3e11a8" containerName="storage-initializer" Apr 21 10:40:08.621833 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:08.621814 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" Apr 21 10:40:08.631869 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:08.631848 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h"] Apr 21 10:40:08.680158 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:08.680121 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ebd96fc-0a2c-4fdb-991b-328a59cb432d-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-mm46h\" (UID: \"2ebd96fc-0a2c-4fdb-991b-328a59cb432d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" Apr 21 10:40:08.781379 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:08.781337 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/2ebd96fc-0a2c-4fdb-991b-328a59cb432d-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-mm46h\" (UID: \"2ebd96fc-0a2c-4fdb-991b-328a59cb432d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" Apr 21 10:40:08.781732 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:08.781712 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ebd96fc-0a2c-4fdb-991b-328a59cb432d-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-mm46h\" (UID: \"2ebd96fc-0a2c-4fdb-991b-328a59cb432d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" Apr 21 10:40:08.931971 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:08.931871 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" Apr 21 10:40:09.058869 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:09.058833 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h"] Apr 21 10:40:09.061897 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:40:09.061863 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ebd96fc_0a2c_4fdb_991b_328a59cb432d.slice/crio-c4e7e41cf57bc4dd00d9cfb85d2038e729007dc3af5b57d8d615becce884d912 WatchSource:0}: Error finding container c4e7e41cf57bc4dd00d9cfb85d2038e729007dc3af5b57d8d615becce884d912: Status 404 returned error can't find the container with id c4e7e41cf57bc4dd00d9cfb85d2038e729007dc3af5b57d8d615becce884d912 Apr 21 10:40:09.887186 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:09.887144 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" 
event={"ID":"2ebd96fc-0a2c-4fdb-991b-328a59cb432d","Type":"ContainerStarted","Data":"cdca78ee531994cd726719ea643dba691225f014903c24254f0e4054e5082d9a"} Apr 21 10:40:09.887186 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:09.887186 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" event={"ID":"2ebd96fc-0a2c-4fdb-991b-328a59cb432d","Type":"ContainerStarted","Data":"c4e7e41cf57bc4dd00d9cfb85d2038e729007dc3af5b57d8d615becce884d912"} Apr 21 10:40:12.896809 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:12.896719 2577 generic.go:358] "Generic (PLEG): container finished" podID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerID="cdca78ee531994cd726719ea643dba691225f014903c24254f0e4054e5082d9a" exitCode=0 Apr 21 10:40:12.896809 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:12.896795 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" event={"ID":"2ebd96fc-0a2c-4fdb-991b-328a59cb432d","Type":"ContainerDied","Data":"cdca78ee531994cd726719ea643dba691225f014903c24254f0e4054e5082d9a"} Apr 21 10:40:13.667821 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.667797 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" Apr 21 10:40:13.719234 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.719183 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12d664fb-d722-4fe9-aafb-2f31294ae810-kserve-provision-location\") pod \"12d664fb-d722-4fe9-aafb-2f31294ae810\" (UID: \"12d664fb-d722-4fe9-aafb-2f31294ae810\") " Apr 21 10:40:13.719530 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.719504 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12d664fb-d722-4fe9-aafb-2f31294ae810-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "12d664fb-d722-4fe9-aafb-2f31294ae810" (UID: "12d664fb-d722-4fe9-aafb-2f31294ae810"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:40:13.819809 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.819710 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12d664fb-d722-4fe9-aafb-2f31294ae810-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:40:13.901236 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.901199 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" event={"ID":"2ebd96fc-0a2c-4fdb-991b-328a59cb432d","Type":"ContainerStarted","Data":"2de8e15a752fe6664c46b73053ea1e9ba2a35f38d5d1ab09548bc378a1854381"} Apr 21 10:40:13.901722 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.901503 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" Apr 21 10:40:13.902720 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:40:13.902694 2577 generic.go:358] "Generic (PLEG): container finished" podID="12d664fb-d722-4fe9-aafb-2f31294ae810" containerID="94e21dcd8873786512937cb493930e7b3d3f4c4ff97e5d8d24c69c37447d6482" exitCode=0 Apr 21 10:40:13.902854 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.902728 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" event={"ID":"12d664fb-d722-4fe9-aafb-2f31294ae810","Type":"ContainerDied","Data":"94e21dcd8873786512937cb493930e7b3d3f4c4ff97e5d8d24c69c37447d6482"} Apr 21 10:40:13.902854 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.902768 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" event={"ID":"12d664fb-d722-4fe9-aafb-2f31294ae810","Type":"ContainerDied","Data":"3a298bf6ff8edfbc252e34620b241e5fc7684790c6ef4ef458e5b2210c9f31cd"} Apr 21 10:40:13.902854 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.902772 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4" Apr 21 10:40:13.902854 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.902787 2577 scope.go:117] "RemoveContainer" containerID="94e21dcd8873786512937cb493930e7b3d3f4c4ff97e5d8d24c69c37447d6482" Apr 21 10:40:13.902854 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.902815 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 21 10:40:13.911225 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.911203 2577 scope.go:117] "RemoveContainer" containerID="30cf3d6e762bf69163eaa5c47f1a6e9b78d8da46ec4ce4e0b062bc0d24ff0342" Apr 21 10:40:13.918306 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.918255 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" podStartSLOduration=5.918240245 podStartE2EDuration="5.918240245s" podCreationTimestamp="2026-04-21 10:40:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:40:13.9175896 +0000 UTC m=+2182.256123628" watchObservedRunningTime="2026-04-21 10:40:13.918240245 +0000 UTC m=+2182.256774263" Apr 21 10:40:13.919062 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.919043 2577 scope.go:117] "RemoveContainer" containerID="94e21dcd8873786512937cb493930e7b3d3f4c4ff97e5d8d24c69c37447d6482" Apr 21 10:40:13.919351 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:40:13.919330 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e21dcd8873786512937cb493930e7b3d3f4c4ff97e5d8d24c69c37447d6482\": container with ID starting 
with 94e21dcd8873786512937cb493930e7b3d3f4c4ff97e5d8d24c69c37447d6482 not found: ID does not exist" containerID="94e21dcd8873786512937cb493930e7b3d3f4c4ff97e5d8d24c69c37447d6482" Apr 21 10:40:13.919425 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.919364 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e21dcd8873786512937cb493930e7b3d3f4c4ff97e5d8d24c69c37447d6482"} err="failed to get container status \"94e21dcd8873786512937cb493930e7b3d3f4c4ff97e5d8d24c69c37447d6482\": rpc error: code = NotFound desc = could not find container \"94e21dcd8873786512937cb493930e7b3d3f4c4ff97e5d8d24c69c37447d6482\": container with ID starting with 94e21dcd8873786512937cb493930e7b3d3f4c4ff97e5d8d24c69c37447d6482 not found: ID does not exist" Apr 21 10:40:13.919425 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.919391 2577 scope.go:117] "RemoveContainer" containerID="30cf3d6e762bf69163eaa5c47f1a6e9b78d8da46ec4ce4e0b062bc0d24ff0342" Apr 21 10:40:13.919627 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:40:13.919611 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30cf3d6e762bf69163eaa5c47f1a6e9b78d8da46ec4ce4e0b062bc0d24ff0342\": container with ID starting with 30cf3d6e762bf69163eaa5c47f1a6e9b78d8da46ec4ce4e0b062bc0d24ff0342 not found: ID does not exist" containerID="30cf3d6e762bf69163eaa5c47f1a6e9b78d8da46ec4ce4e0b062bc0d24ff0342" Apr 21 10:40:13.919672 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.919634 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30cf3d6e762bf69163eaa5c47f1a6e9b78d8da46ec4ce4e0b062bc0d24ff0342"} err="failed to get container status \"30cf3d6e762bf69163eaa5c47f1a6e9b78d8da46ec4ce4e0b062bc0d24ff0342\": rpc error: code = NotFound desc = could not find container \"30cf3d6e762bf69163eaa5c47f1a6e9b78d8da46ec4ce4e0b062bc0d24ff0342\": container with ID starting with 
30cf3d6e762bf69163eaa5c47f1a6e9b78d8da46ec4ce4e0b062bc0d24ff0342 not found: ID does not exist" Apr 21 10:40:13.929448 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.929417 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4"] Apr 21 10:40:13.933374 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:13.933350 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-hj2m4"] Apr 21 10:40:14.232227 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:14.232192 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" path="/var/lib/kubelet/pods/12d664fb-d722-4fe9-aafb-2f31294ae810/volumes" Apr 21 10:40:14.907202 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:14.907163 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 21 10:40:24.907165 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:24.907114 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 21 10:40:34.907327 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:34.907282 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 21 10:40:44.907334 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:40:44.907290 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 21 10:40:54.908058 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:40:54.908011 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 21 10:41:04.907684 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:04.907639 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 21 10:41:14.907374 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:14.907271 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 21 10:41:24.907536 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:24.907484 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 21 10:41:32.231484 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:32.231455 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" Apr 21 10:41:38.742232 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:38.742199 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h"] Apr 21 10:41:38.742776 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:38.742458 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerName="kserve-container" containerID="cri-o://2de8e15a752fe6664c46b73053ea1e9ba2a35f38d5d1ab09548bc378a1854381" gracePeriod=30 Apr 21 10:41:38.824979 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:38.824943 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9"] Apr 21 10:41:38.825266 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:38.825250 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" containerName="storage-initializer" Apr 21 10:41:38.825310 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:38.825270 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" containerName="storage-initializer" Apr 21 10:41:38.825310 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:38.825298 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" containerName="kserve-container" Apr 21 10:41:38.825310 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:38.825305 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" containerName="kserve-container" Apr 21 10:41:38.825403 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:38.825382 2577 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="12d664fb-d722-4fe9-aafb-2f31294ae810" containerName="kserve-container" Apr 21 10:41:38.828482 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:38.828464 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" Apr 21 10:41:38.838786 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:38.838738 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9"] Apr 21 10:41:38.903090 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:38.903044 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55804ecb-a8d4-4a18-a1f7-de70c0b4b719-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-lb7s9\" (UID: \"55804ecb-a8d4-4a18-a1f7-de70c0b4b719\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" Apr 21 10:41:39.004554 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:39.004462 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55804ecb-a8d4-4a18-a1f7-de70c0b4b719-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-lb7s9\" (UID: \"55804ecb-a8d4-4a18-a1f7-de70c0b4b719\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" Apr 21 10:41:39.004872 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:39.004852 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55804ecb-a8d4-4a18-a1f7-de70c0b4b719-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-lb7s9\" (UID: \"55804ecb-a8d4-4a18-a1f7-de70c0b4b719\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" Apr 
21 10:41:39.139047 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:39.139008 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" Apr 21 10:41:39.266383 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:39.266306 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9"] Apr 21 10:41:39.269342 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:41:39.269309 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55804ecb_a8d4_4a18_a1f7_de70c0b4b719.slice/crio-34b912b40a1774e2034dbb3db55ad44c470c6aef5c9be13fb593cf537dd3f991 WatchSource:0}: Error finding container 34b912b40a1774e2034dbb3db55ad44c470c6aef5c9be13fb593cf537dd3f991: Status 404 returned error can't find the container with id 34b912b40a1774e2034dbb3db55ad44c470c6aef5c9be13fb593cf537dd3f991 Apr 21 10:41:40.167382 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:40.167344 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" event={"ID":"55804ecb-a8d4-4a18-a1f7-de70c0b4b719","Type":"ContainerStarted","Data":"9886a03d4f6b06a572f976767347b2874056384041295611f828a9ac591897f5"} Apr 21 10:41:40.167382 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:40.167384 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" event={"ID":"55804ecb-a8d4-4a18-a1f7-de70c0b4b719","Type":"ContainerStarted","Data":"34b912b40a1774e2034dbb3db55ad44c470c6aef5c9be13fb593cf537dd3f991"} Apr 21 10:41:42.230172 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:42.230129 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.45:8080: connect: connection refused" Apr 21 10:41:43.179607 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:43.179518 2577 generic.go:358] "Generic (PLEG): container finished" podID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerID="9886a03d4f6b06a572f976767347b2874056384041295611f828a9ac591897f5" exitCode=0 Apr 21 10:41:43.179607 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:43.179593 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" event={"ID":"55804ecb-a8d4-4a18-a1f7-de70c0b4b719","Type":"ContainerDied","Data":"9886a03d4f6b06a572f976767347b2874056384041295611f828a9ac591897f5"} Apr 21 10:41:43.978483 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:43.978458 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" Apr 21 10:41:44.045083 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.045037 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ebd96fc-0a2c-4fdb-991b-328a59cb432d-kserve-provision-location\") pod \"2ebd96fc-0a2c-4fdb-991b-328a59cb432d\" (UID: \"2ebd96fc-0a2c-4fdb-991b-328a59cb432d\") " Apr 21 10:41:44.045356 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.045333 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ebd96fc-0a2c-4fdb-991b-328a59cb432d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2ebd96fc-0a2c-4fdb-991b-328a59cb432d" (UID: "2ebd96fc-0a2c-4fdb-991b-328a59cb432d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:41:44.145984 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.145896 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ebd96fc-0a2c-4fdb-991b-328a59cb432d-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:41:44.184272 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.184232 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" event={"ID":"55804ecb-a8d4-4a18-a1f7-de70c0b4b719","Type":"ContainerStarted","Data":"724ee3bce4295ed1a5098ad5e8c85800d990cd42bfc424152dfef00db629185c"} Apr 21 10:41:44.184560 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.184536 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" Apr 21 10:41:44.185644 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.185622 2577 generic.go:358] "Generic (PLEG): container finished" podID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerID="2de8e15a752fe6664c46b73053ea1e9ba2a35f38d5d1ab09548bc378a1854381" exitCode=0 Apr 21 10:41:44.185781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.185688 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" event={"ID":"2ebd96fc-0a2c-4fdb-991b-328a59cb432d","Type":"ContainerDied","Data":"2de8e15a752fe6664c46b73053ea1e9ba2a35f38d5d1ab09548bc378a1854381"} Apr 21 10:41:44.185781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.185691 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" Apr 21 10:41:44.185781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.185705 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h" event={"ID":"2ebd96fc-0a2c-4fdb-991b-328a59cb432d","Type":"ContainerDied","Data":"c4e7e41cf57bc4dd00d9cfb85d2038e729007dc3af5b57d8d615becce884d912"} Apr 21 10:41:44.185781 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.185721 2577 scope.go:117] "RemoveContainer" containerID="2de8e15a752fe6664c46b73053ea1e9ba2a35f38d5d1ab09548bc378a1854381" Apr 21 10:41:44.186124 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.186095 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 21 10:41:44.194537 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.194513 2577 scope.go:117] "RemoveContainer" containerID="cdca78ee531994cd726719ea643dba691225f014903c24254f0e4054e5082d9a" Apr 21 10:41:44.201370 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.201316 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" podStartSLOduration=6.201297272 podStartE2EDuration="6.201297272s" podCreationTimestamp="2026-04-21 10:41:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:41:44.199555024 +0000 UTC m=+2272.538089040" watchObservedRunningTime="2026-04-21 10:41:44.201297272 +0000 UTC m=+2272.539831303" Apr 21 10:41:44.201988 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.201969 2577 scope.go:117] "RemoveContainer" 
containerID="2de8e15a752fe6664c46b73053ea1e9ba2a35f38d5d1ab09548bc378a1854381" Apr 21 10:41:44.202320 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:41:44.202294 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2de8e15a752fe6664c46b73053ea1e9ba2a35f38d5d1ab09548bc378a1854381\": container with ID starting with 2de8e15a752fe6664c46b73053ea1e9ba2a35f38d5d1ab09548bc378a1854381 not found: ID does not exist" containerID="2de8e15a752fe6664c46b73053ea1e9ba2a35f38d5d1ab09548bc378a1854381" Apr 21 10:41:44.202409 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.202332 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2de8e15a752fe6664c46b73053ea1e9ba2a35f38d5d1ab09548bc378a1854381"} err="failed to get container status \"2de8e15a752fe6664c46b73053ea1e9ba2a35f38d5d1ab09548bc378a1854381\": rpc error: code = NotFound desc = could not find container \"2de8e15a752fe6664c46b73053ea1e9ba2a35f38d5d1ab09548bc378a1854381\": container with ID starting with 2de8e15a752fe6664c46b73053ea1e9ba2a35f38d5d1ab09548bc378a1854381 not found: ID does not exist" Apr 21 10:41:44.202409 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.202351 2577 scope.go:117] "RemoveContainer" containerID="cdca78ee531994cd726719ea643dba691225f014903c24254f0e4054e5082d9a" Apr 21 10:41:44.202606 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:41:44.202589 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdca78ee531994cd726719ea643dba691225f014903c24254f0e4054e5082d9a\": container with ID starting with cdca78ee531994cd726719ea643dba691225f014903c24254f0e4054e5082d9a not found: ID does not exist" containerID="cdca78ee531994cd726719ea643dba691225f014903c24254f0e4054e5082d9a" Apr 21 10:41:44.202676 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.202608 2577 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"cdca78ee531994cd726719ea643dba691225f014903c24254f0e4054e5082d9a"} err="failed to get container status \"cdca78ee531994cd726719ea643dba691225f014903c24254f0e4054e5082d9a\": rpc error: code = NotFound desc = could not find container \"cdca78ee531994cd726719ea643dba691225f014903c24254f0e4054e5082d9a\": container with ID starting with cdca78ee531994cd726719ea643dba691225f014903c24254f0e4054e5082d9a not found: ID does not exist" Apr 21 10:41:44.211353 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.211328 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h"] Apr 21 10:41:44.217312 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.217285 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-mm46h"] Apr 21 10:41:44.231952 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:44.231925 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" path="/var/lib/kubelet/pods/2ebd96fc-0a2c-4fdb-991b-328a59cb432d/volumes" Apr 21 10:41:45.189573 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:45.189531 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 21 10:41:55.190001 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:41:55.189951 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 21 10:42:05.189570 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:42:05.189525 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 21 10:42:15.190100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:42:15.190053 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 21 10:42:25.189897 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:42:25.189840 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 21 10:42:35.189954 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:42:35.189902 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 21 10:42:45.190184 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:42:45.190083 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 21 10:42:55.190106 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:42:55.190060 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 21 10:42:57.228583 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:42:57.228542 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.46:8080: connect: connection refused" Apr 21 10:43:07.229979 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:07.229942 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" Apr 21 10:43:09.052992 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:09.052954 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9"] Apr 21 10:43:09.053492 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:09.053254 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="kserve-container" containerID="cri-o://724ee3bce4295ed1a5098ad5e8c85800d990cd42bfc424152dfef00db629185c" gracePeriod=30 Apr 21 10:43:09.149248 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:09.149211 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg"] Apr 21 10:43:09.149547 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:09.149533 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerName="kserve-container" Apr 21 10:43:09.149594 ip-10-0-133-157 kubenswrapper[2577]: 
I0421 10:43:09.149549 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerName="kserve-container" Apr 21 10:43:09.149594 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:09.149569 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerName="storage-initializer" Apr 21 10:43:09.149594 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:09.149575 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerName="storage-initializer" Apr 21 10:43:09.149691 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:09.149625 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ebd96fc-0a2c-4fdb-991b-328a59cb432d" containerName="kserve-container" Apr 21 10:43:09.152566 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:09.152548 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" Apr 21 10:43:09.160683 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:09.160657 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg"] Apr 21 10:43:09.331838 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:09.331705 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a89ac118-2074-4c31-b72e-614b9d27c7c7-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg\" (UID: \"a89ac118-2074-4c31-b72e-614b9d27c7c7\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" Apr 21 10:43:09.433113 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:09.433080 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a89ac118-2074-4c31-b72e-614b9d27c7c7-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg\" (UID: \"a89ac118-2074-4c31-b72e-614b9d27c7c7\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" Apr 21 10:43:09.433422 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:09.433403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a89ac118-2074-4c31-b72e-614b9d27c7c7-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg\" (UID: \"a89ac118-2074-4c31-b72e-614b9d27c7c7\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" Apr 21 10:43:09.463205 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:09.463175 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" Apr 21 10:43:09.586840 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:09.586811 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg"] Apr 21 10:43:09.589462 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:43:09.589435 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda89ac118_2074_4c31_b72e_614b9d27c7c7.slice/crio-be24cbb843f526402733e7ed6f18439658717e340a8d1b81b9b8f51c0f83b060 WatchSource:0}: Error finding container be24cbb843f526402733e7ed6f18439658717e340a8d1b81b9b8f51c0f83b060: Status 404 returned error can't find the container with id be24cbb843f526402733e7ed6f18439658717e340a8d1b81b9b8f51c0f83b060 Apr 21 10:43:09.591294 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:09.591278 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:43:10.430157 ip-10-0-133-157 
kubenswrapper[2577]: I0421 10:43:10.430119 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" event={"ID":"a89ac118-2074-4c31-b72e-614b9d27c7c7","Type":"ContainerStarted","Data":"122054bc693f1373be161f2500b472cb5fac6d200e393bd10e496a42dcf8cd66"} Apr 21 10:43:10.430157 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:10.430161 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" event={"ID":"a89ac118-2074-4c31-b72e-614b9d27c7c7","Type":"ContainerStarted","Data":"be24cbb843f526402733e7ed6f18439658717e340a8d1b81b9b8f51c0f83b060"} Apr 21 10:43:13.439469 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:13.439435 2577 generic.go:358] "Generic (PLEG): container finished" podID="a89ac118-2074-4c31-b72e-614b9d27c7c7" containerID="122054bc693f1373be161f2500b472cb5fac6d200e393bd10e496a42dcf8cd66" exitCode=0 Apr 21 10:43:13.439929 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:13.439507 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" event={"ID":"a89ac118-2074-4c31-b72e-614b9d27c7c7","Type":"ContainerDied","Data":"122054bc693f1373be161f2500b472cb5fac6d200e393bd10e496a42dcf8cd66"} Apr 21 10:43:14.444255 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:14.444219 2577 generic.go:358] "Generic (PLEG): container finished" podID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerID="724ee3bce4295ed1a5098ad5e8c85800d990cd42bfc424152dfef00db629185c" exitCode=0 Apr 21 10:43:14.444255 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:14.444252 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" event={"ID":"55804ecb-a8d4-4a18-a1f7-de70c0b4b719","Type":"ContainerDied","Data":"724ee3bce4295ed1a5098ad5e8c85800d990cd42bfc424152dfef00db629185c"} Apr 21 10:43:14.445902 
ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:14.445878 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" event={"ID":"a89ac118-2074-4c31-b72e-614b9d27c7c7","Type":"ContainerStarted","Data":"76ba78d0f6ef978ace628646502b855551283e1325b988508077bf902d515ab9"} Apr 21 10:43:14.446100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:14.446078 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" Apr 21 10:43:14.465775 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:14.465708 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" podStartSLOduration=5.465692452 podStartE2EDuration="5.465692452s" podCreationTimestamp="2026-04-21 10:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:43:14.464181395 +0000 UTC m=+2362.802715414" watchObservedRunningTime="2026-04-21 10:43:14.465692452 +0000 UTC m=+2362.804226468" Apr 21 10:43:14.491760 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:14.491723 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" Apr 21 10:43:14.675632 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:14.675530 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55804ecb-a8d4-4a18-a1f7-de70c0b4b719-kserve-provision-location\") pod \"55804ecb-a8d4-4a18-a1f7-de70c0b4b719\" (UID: \"55804ecb-a8d4-4a18-a1f7-de70c0b4b719\") " Apr 21 10:43:14.675926 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:14.675901 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55804ecb-a8d4-4a18-a1f7-de70c0b4b719-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "55804ecb-a8d4-4a18-a1f7-de70c0b4b719" (UID: "55804ecb-a8d4-4a18-a1f7-de70c0b4b719"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:43:14.776130 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:14.776071 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/55804ecb-a8d4-4a18-a1f7-de70c0b4b719-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:43:15.450232 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:15.450201 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" Apr 21 10:43:15.450232 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:15.450219 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9" event={"ID":"55804ecb-a8d4-4a18-a1f7-de70c0b4b719","Type":"ContainerDied","Data":"34b912b40a1774e2034dbb3db55ad44c470c6aef5c9be13fb593cf537dd3f991"} Apr 21 10:43:15.450835 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:15.450265 2577 scope.go:117] "RemoveContainer" containerID="724ee3bce4295ed1a5098ad5e8c85800d990cd42bfc424152dfef00db629185c" Apr 21 10:43:15.458387 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:15.458366 2577 scope.go:117] "RemoveContainer" containerID="9886a03d4f6b06a572f976767347b2874056384041295611f828a9ac591897f5" Apr 21 10:43:15.471897 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:15.471866 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9"] Apr 21 10:43:15.478951 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:15.478923 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-lb7s9"] Apr 21 10:43:16.237302 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:16.237271 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" path="/var/lib/kubelet/pods/55804ecb-a8d4-4a18-a1f7-de70c0b4b719/volumes" Apr 21 10:43:45.452680 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:45.452628 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" podUID="a89ac118-2074-4c31-b72e-614b9d27c7c7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.47:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.47:8080: connect: 
connection refused" Apr 21 10:43:52.278677 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:52.278647 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 10:43:52.281819 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:52.281787 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 10:43:55.451275 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:43:55.451231 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" podUID="a89ac118-2074-4c31-b72e-614b9d27c7c7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.47:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.47:8080: connect: connection refused" Apr 21 10:44:05.451583 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:05.451534 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" podUID="a89ac118-2074-4c31-b72e-614b9d27c7c7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.47:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.47:8080: connect: connection refused" Apr 21 10:44:15.451129 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:15.451023 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" podUID="a89ac118-2074-4c31-b72e-614b9d27c7c7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.47:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.47:8080: connect: connection refused" Apr 21 10:44:25.451349 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:44:25.451298 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" podUID="a89ac118-2074-4c31-b72e-614b9d27c7c7" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.47:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.134.0.47:8080: connect: connection refused" Apr 21 10:44:35.455342 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:35.455311 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" Apr 21 10:44:39.253514 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:39.253475 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg"] Apr 21 10:44:39.254034 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:39.253829 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" podUID="a89ac118-2074-4c31-b72e-614b9d27c7c7" containerName="kserve-container" containerID="cri-o://76ba78d0f6ef978ace628646502b855551283e1325b988508077bf902d515ab9" gracePeriod=30 Apr 21 10:44:39.358840 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:39.358797 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b"] Apr 21 10:44:39.359205 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:39.359188 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="storage-initializer" Apr 21 10:44:39.359289 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:39.359208 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="storage-initializer" Apr 21 10:44:39.359289 ip-10-0-133-157 kubenswrapper[2577]: I0421 
10:44:39.359224 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="kserve-container" Apr 21 10:44:39.359289 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:39.359232 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="kserve-container" Apr 21 10:44:39.359439 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:39.359315 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="55804ecb-a8d4-4a18-a1f7-de70c0b4b719" containerName="kserve-container" Apr 21 10:44:39.362258 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:39.362236 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" Apr 21 10:44:39.371481 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:39.371454 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b"] Apr 21 10:44:39.437169 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:39.437133 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2eecc9a4-dc58-4a01-aabc-43ff60ebd843-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b\" (UID: \"2eecc9a4-dc58-4a01-aabc-43ff60ebd843\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" Apr 21 10:44:39.537992 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:39.537877 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2eecc9a4-dc58-4a01-aabc-43ff60ebd843-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b\" (UID: \"2eecc9a4-dc58-4a01-aabc-43ff60ebd843\") " 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" Apr 21 10:44:39.538265 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:39.538244 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2eecc9a4-dc58-4a01-aabc-43ff60ebd843-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b\" (UID: \"2eecc9a4-dc58-4a01-aabc-43ff60ebd843\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" Apr 21 10:44:39.672720 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:39.672683 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" Apr 21 10:44:39.796000 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:39.795923 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b"] Apr 21 10:44:39.799388 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:44:39.799358 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2eecc9a4_dc58_4a01_aabc_43ff60ebd843.slice/crio-f9c5f0618f52cf1fbf47179d1b533e506eba18f7330626377860041197873d40 WatchSource:0}: Error finding container f9c5f0618f52cf1fbf47179d1b533e506eba18f7330626377860041197873d40: Status 404 returned error can't find the container with id f9c5f0618f52cf1fbf47179d1b533e506eba18f7330626377860041197873d40 Apr 21 10:44:40.693241 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:40.693202 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" event={"ID":"2eecc9a4-dc58-4a01-aabc-43ff60ebd843","Type":"ContainerStarted","Data":"18211f8ed6dddc53e53efbd79b15e4a32a9000c2642982a2bb7317a98f7beb2b"} Apr 21 10:44:40.693241 ip-10-0-133-157 kubenswrapper[2577]: 
I0421 10:44:40.693246 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" event={"ID":"2eecc9a4-dc58-4a01-aabc-43ff60ebd843","Type":"ContainerStarted","Data":"f9c5f0618f52cf1fbf47179d1b533e506eba18f7330626377860041197873d40"} Apr 21 10:44:43.704158 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:43.704125 2577 generic.go:358] "Generic (PLEG): container finished" podID="2eecc9a4-dc58-4a01-aabc-43ff60ebd843" containerID="18211f8ed6dddc53e53efbd79b15e4a32a9000c2642982a2bb7317a98f7beb2b" exitCode=0 Apr 21 10:44:43.704543 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:43.704185 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" event={"ID":"2eecc9a4-dc58-4a01-aabc-43ff60ebd843","Type":"ContainerDied","Data":"18211f8ed6dddc53e53efbd79b15e4a32a9000c2642982a2bb7317a98f7beb2b"} Apr 21 10:44:44.498134 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.498106 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" Apr 21 10:44:44.579021 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.578929 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a89ac118-2074-4c31-b72e-614b9d27c7c7-kserve-provision-location\") pod \"a89ac118-2074-4c31-b72e-614b9d27c7c7\" (UID: \"a89ac118-2074-4c31-b72e-614b9d27c7c7\") " Apr 21 10:44:44.579278 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.579254 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a89ac118-2074-4c31-b72e-614b9d27c7c7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a89ac118-2074-4c31-b72e-614b9d27c7c7" (UID: "a89ac118-2074-4c31-b72e-614b9d27c7c7"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:44:44.680175 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.680145 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a89ac118-2074-4c31-b72e-614b9d27c7c7-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:44:44.708921 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.708884 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" event={"ID":"2eecc9a4-dc58-4a01-aabc-43ff60ebd843","Type":"ContainerStarted","Data":"fbec80b80a077e8a218775db723bfe805f75ec79504b6544424b2779cdcdebb2"} Apr 21 10:44:44.709304 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.709107 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" Apr 21 10:44:44.710295 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.710268 2577 generic.go:358] "Generic (PLEG): container finished" podID="a89ac118-2074-4c31-b72e-614b9d27c7c7" containerID="76ba78d0f6ef978ace628646502b855551283e1325b988508077bf902d515ab9" exitCode=0 Apr 21 10:44:44.710368 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.710313 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" event={"ID":"a89ac118-2074-4c31-b72e-614b9d27c7c7","Type":"ContainerDied","Data":"76ba78d0f6ef978ace628646502b855551283e1325b988508077bf902d515ab9"} Apr 21 10:44:44.710368 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.710326 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" Apr 21 10:44:44.710368 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.710340 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg" event={"ID":"a89ac118-2074-4c31-b72e-614b9d27c7c7","Type":"ContainerDied","Data":"be24cbb843f526402733e7ed6f18439658717e340a8d1b81b9b8f51c0f83b060"} Apr 21 10:44:44.710368 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.710360 2577 scope.go:117] "RemoveContainer" containerID="76ba78d0f6ef978ace628646502b855551283e1325b988508077bf902d515ab9" Apr 21 10:44:44.724488 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.724461 2577 scope.go:117] "RemoveContainer" containerID="122054bc693f1373be161f2500b472cb5fac6d200e393bd10e496a42dcf8cd66" Apr 21 10:44:44.726854 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.726802 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" podStartSLOduration=5.72678781 podStartE2EDuration="5.72678781s" podCreationTimestamp="2026-04-21 10:44:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:44:44.725800115 +0000 UTC m=+2453.064334131" watchObservedRunningTime="2026-04-21 10:44:44.72678781 +0000 UTC m=+2453.065321827" Apr 21 10:44:44.732998 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.732976 2577 scope.go:117] "RemoveContainer" containerID="76ba78d0f6ef978ace628646502b855551283e1325b988508077bf902d515ab9" Apr 21 10:44:44.733343 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:44:44.733322 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76ba78d0f6ef978ace628646502b855551283e1325b988508077bf902d515ab9\": container with ID starting with 
76ba78d0f6ef978ace628646502b855551283e1325b988508077bf902d515ab9 not found: ID does not exist" containerID="76ba78d0f6ef978ace628646502b855551283e1325b988508077bf902d515ab9" Apr 21 10:44:44.733397 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.733354 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76ba78d0f6ef978ace628646502b855551283e1325b988508077bf902d515ab9"} err="failed to get container status \"76ba78d0f6ef978ace628646502b855551283e1325b988508077bf902d515ab9\": rpc error: code = NotFound desc = could not find container \"76ba78d0f6ef978ace628646502b855551283e1325b988508077bf902d515ab9\": container with ID starting with 76ba78d0f6ef978ace628646502b855551283e1325b988508077bf902d515ab9 not found: ID does not exist" Apr 21 10:44:44.733397 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.733373 2577 scope.go:117] "RemoveContainer" containerID="122054bc693f1373be161f2500b472cb5fac6d200e393bd10e496a42dcf8cd66" Apr 21 10:44:44.733647 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:44:44.733628 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"122054bc693f1373be161f2500b472cb5fac6d200e393bd10e496a42dcf8cd66\": container with ID starting with 122054bc693f1373be161f2500b472cb5fac6d200e393bd10e496a42dcf8cd66 not found: ID does not exist" containerID="122054bc693f1373be161f2500b472cb5fac6d200e393bd10e496a42dcf8cd66" Apr 21 10:44:44.733705 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.733651 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"122054bc693f1373be161f2500b472cb5fac6d200e393bd10e496a42dcf8cd66"} err="failed to get container status \"122054bc693f1373be161f2500b472cb5fac6d200e393bd10e496a42dcf8cd66\": rpc error: code = NotFound desc = could not find container \"122054bc693f1373be161f2500b472cb5fac6d200e393bd10e496a42dcf8cd66\": container with ID starting with 
122054bc693f1373be161f2500b472cb5fac6d200e393bd10e496a42dcf8cd66 not found: ID does not exist"
Apr 21 10:44:44.740831 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.740738 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg"]
Apr 21 10:44:44.742559 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:44.742536 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-vg2pg"]
Apr 21 10:44:46.231601 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:44:46.231565 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a89ac118-2074-4c31-b72e-614b9d27c7c7" path="/var/lib/kubelet/pods/a89ac118-2074-4c31-b72e-614b9d27c7c7/volumes"
Apr 21 10:45:15.716512 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:45:15.716467 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" podUID="2eecc9a4-dc58-4a01-aabc-43ff60ebd843" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.48:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 21 10:45:25.715171 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:45:25.715118 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" podUID="2eecc9a4-dc58-4a01-aabc-43ff60ebd843" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.48:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 21 10:45:35.715615 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:45:35.715561 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" podUID="2eecc9a4-dc58-4a01-aabc-43ff60ebd843" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.48:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 21 10:45:45.715463 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:45:45.715371 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" podUID="2eecc9a4-dc58-4a01-aabc-43ff60ebd843" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.48:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 21 10:45:50.228437 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:45:50.228386 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" podUID="2eecc9a4-dc58-4a01-aabc-43ff60ebd843" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.48:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 21 10:46:00.232685 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:00.232652 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b"
Apr 21 10:46:09.475813 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:09.475775 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b"]
Apr 21 10:46:09.476208 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:09.476068 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" podUID="2eecc9a4-dc58-4a01-aabc-43ff60ebd843" containerName="kserve-container" containerID="cri-o://fbec80b80a077e8a218775db723bfe805f75ec79504b6544424b2779cdcdebb2" gracePeriod=30
Apr 21 10:46:09.545153 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:09.545116 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn"]
Apr 21 10:46:09.545491 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:09.545477 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a89ac118-2074-4c31-b72e-614b9d27c7c7" containerName="kserve-container"
Apr 21 10:46:09.545536 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:09.545493 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89ac118-2074-4c31-b72e-614b9d27c7c7" containerName="kserve-container"
Apr 21 10:46:09.545536 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:09.545516 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a89ac118-2074-4c31-b72e-614b9d27c7c7" containerName="storage-initializer"
Apr 21 10:46:09.545536 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:09.545522 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89ac118-2074-4c31-b72e-614b9d27c7c7" containerName="storage-initializer"
Apr 21 10:46:09.545633 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:09.545573 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a89ac118-2074-4c31-b72e-614b9d27c7c7" containerName="kserve-container"
Apr 21 10:46:09.548628 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:09.548612 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn"
Apr 21 10:46:09.558254 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:09.558224 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn"]
Apr 21 10:46:09.648380 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:09.648342 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f160deea-3c8f-4d2c-8d01-97e7b0a3d93b-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn\" (UID: \"f160deea-3c8f-4d2c-8d01-97e7b0a3d93b\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn"
Apr 21 10:46:09.749103 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:09.749012 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f160deea-3c8f-4d2c-8d01-97e7b0a3d93b-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn\" (UID: \"f160deea-3c8f-4d2c-8d01-97e7b0a3d93b\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn"
Apr 21 10:46:09.749385 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:09.749365 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f160deea-3c8f-4d2c-8d01-97e7b0a3d93b-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn\" (UID: \"f160deea-3c8f-4d2c-8d01-97e7b0a3d93b\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn"
Apr 21 10:46:09.858871 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:09.858836 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn"
Apr 21 10:46:09.994933 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:09.994898 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn"]
Apr 21 10:46:09.999511 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:46:09.999436 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf160deea_3c8f_4d2c_8d01_97e7b0a3d93b.slice/crio-e7cdb4919b1e9d04f60ea095878cd4e7e7fb807c56a29185f28556f41fae02ee WatchSource:0}: Error finding container e7cdb4919b1e9d04f60ea095878cd4e7e7fb807c56a29185f28556f41fae02ee: Status 404 returned error can't find the container with id e7cdb4919b1e9d04f60ea095878cd4e7e7fb807c56a29185f28556f41fae02ee
Apr 21 10:46:10.229269 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:10.229225 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" podUID="2eecc9a4-dc58-4a01-aabc-43ff60ebd843" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.48:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.134.0.48:8080: connect: connection refused"
Apr 21 10:46:10.955676 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:10.955642 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn" event={"ID":"f160deea-3c8f-4d2c-8d01-97e7b0a3d93b","Type":"ContainerStarted","Data":"ce72872738939cdb05354d4c3c398c6e9f7227045df8f95366068f0cb743f095"}
Apr 21 10:46:10.955676 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:10.955682 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn" event={"ID":"f160deea-3c8f-4d2c-8d01-97e7b0a3d93b","Type":"ContainerStarted","Data":"e7cdb4919b1e9d04f60ea095878cd4e7e7fb807c56a29185f28556f41fae02ee"}
Apr 21 10:46:13.964874 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:13.964837 2577 generic.go:358] "Generic (PLEG): container finished" podID="f160deea-3c8f-4d2c-8d01-97e7b0a3d93b" containerID="ce72872738939cdb05354d4c3c398c6e9f7227045df8f95366068f0cb743f095" exitCode=0
Apr 21 10:46:13.965256 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:13.964872 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn" event={"ID":"f160deea-3c8f-4d2c-8d01-97e7b0a3d93b","Type":"ContainerDied","Data":"ce72872738939cdb05354d4c3c398c6e9f7227045df8f95366068f0cb743f095"}
Apr 21 10:46:14.622040 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.622017 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b"
Apr 21 10:46:14.689489 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.689398 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2eecc9a4-dc58-4a01-aabc-43ff60ebd843-kserve-provision-location\") pod \"2eecc9a4-dc58-4a01-aabc-43ff60ebd843\" (UID: \"2eecc9a4-dc58-4a01-aabc-43ff60ebd843\") "
Apr 21 10:46:14.689725 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.689702 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eecc9a4-dc58-4a01-aabc-43ff60ebd843-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2eecc9a4-dc58-4a01-aabc-43ff60ebd843" (UID: "2eecc9a4-dc58-4a01-aabc-43ff60ebd843"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:46:14.790570 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.790535 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2eecc9a4-dc58-4a01-aabc-43ff60ebd843-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:46:14.969472 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.969436 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn" event={"ID":"f160deea-3c8f-4d2c-8d01-97e7b0a3d93b","Type":"ContainerStarted","Data":"5586aba9585a30c1ee2294a55c4188791a14e69fd472712306571403a627ca77"}
Apr 21 10:46:14.969928 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.969682 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn"
Apr 21 10:46:14.970828 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.970806 2577 generic.go:358] "Generic (PLEG): container finished" podID="2eecc9a4-dc58-4a01-aabc-43ff60ebd843" containerID="fbec80b80a077e8a218775db723bfe805f75ec79504b6544424b2779cdcdebb2" exitCode=0
Apr 21 10:46:14.970944 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.970866 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" event={"ID":"2eecc9a4-dc58-4a01-aabc-43ff60ebd843","Type":"ContainerDied","Data":"fbec80b80a077e8a218775db723bfe805f75ec79504b6544424b2779cdcdebb2"}
Apr 21 10:46:14.970944 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.970869 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b"
Apr 21 10:46:14.970944 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.970886 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b" event={"ID":"2eecc9a4-dc58-4a01-aabc-43ff60ebd843","Type":"ContainerDied","Data":"f9c5f0618f52cf1fbf47179d1b533e506eba18f7330626377860041197873d40"}
Apr 21 10:46:14.970944 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.970901 2577 scope.go:117] "RemoveContainer" containerID="fbec80b80a077e8a218775db723bfe805f75ec79504b6544424b2779cdcdebb2"
Apr 21 10:46:14.978986 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.978960 2577 scope.go:117] "RemoveContainer" containerID="18211f8ed6dddc53e53efbd79b15e4a32a9000c2642982a2bb7317a98f7beb2b"
Apr 21 10:46:14.986355 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.986334 2577 scope.go:117] "RemoveContainer" containerID="fbec80b80a077e8a218775db723bfe805f75ec79504b6544424b2779cdcdebb2"
Apr 21 10:46:14.986679 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:46:14.986650 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbec80b80a077e8a218775db723bfe805f75ec79504b6544424b2779cdcdebb2\": container with ID starting with fbec80b80a077e8a218775db723bfe805f75ec79504b6544424b2779cdcdebb2 not found: ID does not exist" containerID="fbec80b80a077e8a218775db723bfe805f75ec79504b6544424b2779cdcdebb2"
Apr 21 10:46:14.986782 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.986691 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbec80b80a077e8a218775db723bfe805f75ec79504b6544424b2779cdcdebb2"} err="failed to get container status \"fbec80b80a077e8a218775db723bfe805f75ec79504b6544424b2779cdcdebb2\": rpc error: code = NotFound desc = could not find container \"fbec80b80a077e8a218775db723bfe805f75ec79504b6544424b2779cdcdebb2\": container with ID starting with fbec80b80a077e8a218775db723bfe805f75ec79504b6544424b2779cdcdebb2 not found: ID does not exist"
Apr 21 10:46:14.986782 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.986718 2577 scope.go:117] "RemoveContainer" containerID="18211f8ed6dddc53e53efbd79b15e4a32a9000c2642982a2bb7317a98f7beb2b"
Apr 21 10:46:14.987165 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:46:14.987064 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18211f8ed6dddc53e53efbd79b15e4a32a9000c2642982a2bb7317a98f7beb2b\": container with ID starting with 18211f8ed6dddc53e53efbd79b15e4a32a9000c2642982a2bb7317a98f7beb2b not found: ID does not exist" containerID="18211f8ed6dddc53e53efbd79b15e4a32a9000c2642982a2bb7317a98f7beb2b"
Apr 21 10:46:14.987165 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.987095 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18211f8ed6dddc53e53efbd79b15e4a32a9000c2642982a2bb7317a98f7beb2b"} err="failed to get container status \"18211f8ed6dddc53e53efbd79b15e4a32a9000c2642982a2bb7317a98f7beb2b\": rpc error: code = NotFound desc = could not find container \"18211f8ed6dddc53e53efbd79b15e4a32a9000c2642982a2bb7317a98f7beb2b\": container with ID starting with 18211f8ed6dddc53e53efbd79b15e4a32a9000c2642982a2bb7317a98f7beb2b not found: ID does not exist"
Apr 21 10:46:14.988128 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.988092 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn" podStartSLOduration=5.988080684 podStartE2EDuration="5.988080684s" podCreationTimestamp="2026-04-21 10:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:46:14.986347277 +0000 UTC m=+2543.324881294" watchObservedRunningTime="2026-04-21 10:46:14.988080684 +0000 UTC m=+2543.326614701"
Apr 21 10:46:14.999193 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:14.999168 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b"]
Apr 21 10:46:15.002715 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:15.002691 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-fmh5b"]
Apr 21 10:46:16.232682 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:16.232647 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eecc9a4-dc58-4a01-aabc-43ff60ebd843" path="/var/lib/kubelet/pods/2eecc9a4-dc58-4a01-aabc-43ff60ebd843/volumes"
Apr 21 10:46:45.977060 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:45.977006 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn" podUID="f160deea-3c8f-4d2c-8d01-97e7b0a3d93b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.49:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 21 10:46:55.976535 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:46:55.976489 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn" podUID="f160deea-3c8f-4d2c-8d01-97e7b0a3d93b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.49:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 21 10:47:05.976160 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:05.976112 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn" podUID="f160deea-3c8f-4d2c-8d01-97e7b0a3d93b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.49:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 21 10:47:15.975828 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:15.975719 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn" podUID="f160deea-3c8f-4d2c-8d01-97e7b0a3d93b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.49:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 21 10:47:25.976042 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:25.975993 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn" podUID="f160deea-3c8f-4d2c-8d01-97e7b0a3d93b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.49:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.134.0.49:8080: connect: connection refused"
Apr 21 10:47:35.980291 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:35.980253 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn"
Apr 21 10:47:39.649445 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:39.649409 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn"]
Apr 21 10:47:39.649947 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:39.649682 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn" podUID="f160deea-3c8f-4d2c-8d01-97e7b0a3d93b" containerName="kserve-container" containerID="cri-o://5586aba9585a30c1ee2294a55c4188791a14e69fd472712306571403a627ca77" gracePeriod=30
Apr 21 10:47:41.844617 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:41.844565 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk"]
Apr 21 10:47:41.845145 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:41.845016 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2eecc9a4-dc58-4a01-aabc-43ff60ebd843" containerName="kserve-container"
Apr 21 10:47:41.845145 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:41.845036 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eecc9a4-dc58-4a01-aabc-43ff60ebd843" containerName="kserve-container"
Apr 21 10:47:41.845145 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:41.845050 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2eecc9a4-dc58-4a01-aabc-43ff60ebd843" containerName="storage-initializer"
Apr 21 10:47:41.845145 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:41.845058 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eecc9a4-dc58-4a01-aabc-43ff60ebd843" containerName="storage-initializer"
Apr 21 10:47:41.845145 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:41.845144 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="2eecc9a4-dc58-4a01-aabc-43ff60ebd843" containerName="kserve-container"
Apr 21 10:47:41.848252 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:41.848232 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk"
Apr 21 10:47:41.861737 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:41.861710 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk"]
Apr 21 10:47:42.004306 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:42.004261 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95-kserve-provision-location\") pod \"isvc-sklearn-predictor-7f8c8ff5db-rsqxk\" (UID: \"43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk"
Apr 21 10:47:42.105569 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:42.105473 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95-kserve-provision-location\") pod \"isvc-sklearn-predictor-7f8c8ff5db-rsqxk\" (UID: \"43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk"
Apr 21 10:47:42.105863 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:42.105844 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95-kserve-provision-location\") pod \"isvc-sklearn-predictor-7f8c8ff5db-rsqxk\" (UID: \"43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk"
Apr 21 10:47:42.158606 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:42.158571 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk"
Apr 21 10:47:42.281854 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:42.281823 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk"]
Apr 21 10:47:42.285101 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:47:42.285064 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43f5b4d3_c4e2_4ada_8f48_9ccee74f2b95.slice/crio-be81a23d6041824a5f95eece7858dde372ff403aebdcde219f93851e33273dc5 WatchSource:0}: Error finding container be81a23d6041824a5f95eece7858dde372ff403aebdcde219f93851e33273dc5: Status 404 returned error can't find the container with id be81a23d6041824a5f95eece7858dde372ff403aebdcde219f93851e33273dc5
Apr 21 10:47:43.232743 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:43.232709 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" event={"ID":"43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95","Type":"ContainerStarted","Data":"863857f90ca2ef5707144252325e1f36e010586e8203277226f27f56b28d0365"}
Apr 21 10:47:43.232743 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:43.232761 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" event={"ID":"43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95","Type":"ContainerStarted","Data":"be81a23d6041824a5f95eece7858dde372ff403aebdcde219f93851e33273dc5"}
Apr 21 10:47:44.991664 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:44.991639 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn"
Apr 21 10:47:45.130235 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:45.130130 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f160deea-3c8f-4d2c-8d01-97e7b0a3d93b-kserve-provision-location\") pod \"f160deea-3c8f-4d2c-8d01-97e7b0a3d93b\" (UID: \"f160deea-3c8f-4d2c-8d01-97e7b0a3d93b\") "
Apr 21 10:47:45.130520 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:45.130497 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f160deea-3c8f-4d2c-8d01-97e7b0a3d93b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f160deea-3c8f-4d2c-8d01-97e7b0a3d93b" (UID: "f160deea-3c8f-4d2c-8d01-97e7b0a3d93b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:47:45.231679 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:45.231641 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f160deea-3c8f-4d2c-8d01-97e7b0a3d93b-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:47:45.239214 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:45.239183 2577 generic.go:358] "Generic (PLEG): container finished" podID="f160deea-3c8f-4d2c-8d01-97e7b0a3d93b" containerID="5586aba9585a30c1ee2294a55c4188791a14e69fd472712306571403a627ca77" exitCode=0
Apr 21 10:47:45.239372 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:45.239246 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn"
Apr 21 10:47:45.239372 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:45.239255 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn" event={"ID":"f160deea-3c8f-4d2c-8d01-97e7b0a3d93b","Type":"ContainerDied","Data":"5586aba9585a30c1ee2294a55c4188791a14e69fd472712306571403a627ca77"}
Apr 21 10:47:45.239372 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:45.239280 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn" event={"ID":"f160deea-3c8f-4d2c-8d01-97e7b0a3d93b","Type":"ContainerDied","Data":"e7cdb4919b1e9d04f60ea095878cd4e7e7fb807c56a29185f28556f41fae02ee"}
Apr 21 10:47:45.239372 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:45.239295 2577 scope.go:117] "RemoveContainer" containerID="5586aba9585a30c1ee2294a55c4188791a14e69fd472712306571403a627ca77"
Apr 21 10:47:45.247162 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:45.247144 2577 scope.go:117] "RemoveContainer" containerID="ce72872738939cdb05354d4c3c398c6e9f7227045df8f95366068f0cb743f095"
Apr 21 10:47:45.254330 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:45.254311 2577 scope.go:117] "RemoveContainer" containerID="5586aba9585a30c1ee2294a55c4188791a14e69fd472712306571403a627ca77"
Apr 21 10:47:45.254608 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:47:45.254577 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5586aba9585a30c1ee2294a55c4188791a14e69fd472712306571403a627ca77\": container with ID starting with 5586aba9585a30c1ee2294a55c4188791a14e69fd472712306571403a627ca77 not found: ID does not exist" containerID="5586aba9585a30c1ee2294a55c4188791a14e69fd472712306571403a627ca77"
Apr 21 10:47:45.254698 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:45.254614 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5586aba9585a30c1ee2294a55c4188791a14e69fd472712306571403a627ca77"} err="failed to get container status \"5586aba9585a30c1ee2294a55c4188791a14e69fd472712306571403a627ca77\": rpc error: code = NotFound desc = could not find container \"5586aba9585a30c1ee2294a55c4188791a14e69fd472712306571403a627ca77\": container with ID starting with 5586aba9585a30c1ee2294a55c4188791a14e69fd472712306571403a627ca77 not found: ID does not exist"
Apr 21 10:47:45.254698 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:45.254633 2577 scope.go:117] "RemoveContainer" containerID="ce72872738939cdb05354d4c3c398c6e9f7227045df8f95366068f0cb743f095"
Apr 21 10:47:45.254917 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:47:45.254899 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce72872738939cdb05354d4c3c398c6e9f7227045df8f95366068f0cb743f095\": container with ID starting with ce72872738939cdb05354d4c3c398c6e9f7227045df8f95366068f0cb743f095 not found: ID does not exist" containerID="ce72872738939cdb05354d4c3c398c6e9f7227045df8f95366068f0cb743f095"
Apr 21 10:47:45.254962 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:45.254924 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce72872738939cdb05354d4c3c398c6e9f7227045df8f95366068f0cb743f095"} err="failed to get container status \"ce72872738939cdb05354d4c3c398c6e9f7227045df8f95366068f0cb743f095\": rpc error: code = NotFound desc = could not find container \"ce72872738939cdb05354d4c3c398c6e9f7227045df8f95366068f0cb743f095\": container with ID starting with ce72872738939cdb05354d4c3c398c6e9f7227045df8f95366068f0cb743f095 not found: ID does not exist"
Apr 21 10:47:45.259983 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:45.259960 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn"]
Apr 21 10:47:45.265429 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:45.265404 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-mchcn"]
Apr 21 10:47:46.231681 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:46.231648 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f160deea-3c8f-4d2c-8d01-97e7b0a3d93b" path="/var/lib/kubelet/pods/f160deea-3c8f-4d2c-8d01-97e7b0a3d93b/volumes"
Apr 21 10:47:46.243677 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:46.243643 2577 generic.go:358] "Generic (PLEG): container finished" podID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerID="863857f90ca2ef5707144252325e1f36e010586e8203277226f27f56b28d0365" exitCode=0
Apr 21 10:47:46.243861 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:46.243717 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" event={"ID":"43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95","Type":"ContainerDied","Data":"863857f90ca2ef5707144252325e1f36e010586e8203277226f27f56b28d0365"}
Apr 21 10:47:47.249605 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:47.249573 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" event={"ID":"43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95","Type":"ContainerStarted","Data":"59e126de591e7cc3fb43d47876abe07663705550a03464f68cab8639ccae5551"}
Apr 21 10:47:47.250036 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:47.249870 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk"
Apr 21 10:47:47.251142 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:47.251116 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" podUID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 21 10:47:47.266794 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:47.266725 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" podStartSLOduration=6.266709801 podStartE2EDuration="6.266709801s" podCreationTimestamp="2026-04-21 10:47:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:47:47.265183347 +0000 UTC m=+2635.603717364" watchObservedRunningTime="2026-04-21 10:47:47.266709801 +0000 UTC m=+2635.605243818"
Apr 21 10:47:48.252386 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:48.252340 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" podUID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 21 10:47:58.253417 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:47:58.253373 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" podUID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 21 10:48:08.253144 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:48:08.253093 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" podUID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 21 10:48:18.252849 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:48:18.252802 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" podUID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 21 10:48:28.253272 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:48:28.253230 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" podUID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 21 10:48:38.252349 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:48:38.252308 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" podUID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 21 10:48:48.253091 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:48:48.252993 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" podUID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.50:8080: connect: connection refused"
Apr 21 10:48:52.300100 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:48:52.300070 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log"
Apr 21 10:48:52.303416 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:48:52.303393 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log"
Apr 21 10:48:58.253966 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:48:58.253930 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk"
Apr 21 10:49:01.916720 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:01.916684 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk"]
Apr 21 10:49:01.917135 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:01.917025 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" podUID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerName="kserve-container" containerID="cri-o://59e126de591e7cc3fb43d47876abe07663705550a03464f68cab8639ccae5551" gracePeriod=30
Apr 21 10:49:01.984651 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:01.984613 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j"]
Apr 21 10:49:01.984942 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:01.984929 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f160deea-3c8f-4d2c-8d01-97e7b0a3d93b" containerName="kserve-container"
Apr 21 10:49:01.984942 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:01.984943 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f160deea-3c8f-4d2c-8d01-97e7b0a3d93b" containerName="kserve-container"
Apr 21 10:49:01.985034 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:01.984953 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f160deea-3c8f-4d2c-8d01-97e7b0a3d93b" containerName="storage-initializer"
Apr 21 10:49:01.985034 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:01.984959 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f160deea-3c8f-4d2c-8d01-97e7b0a3d93b" containerName="storage-initializer"
Apr 21 10:49:01.985034 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:01.985010 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f160deea-3c8f-4d2c-8d01-97e7b0a3d93b" containerName="kserve-container"
Apr 21 10:49:01.987946 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:01.987926 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j"
Apr 21 10:49:01.996739 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:01.996710 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j"]
Apr 21 10:49:02.042197 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:02.042157 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d8bf364-3d3e-4125-b2f4-9c95447732d9-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-v8w4j\" (UID: \"9d8bf364-3d3e-4125-b2f4-9c95447732d9\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j"
Apr 21 10:49:02.143348 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:02.143299 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d8bf364-3d3e-4125-b2f4-9c95447732d9-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-v8w4j\" (UID: \"9d8bf364-3d3e-4125-b2f4-9c95447732d9\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j"
Apr 21 10:49:02.143682 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:02.143659 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d8bf364-3d3e-4125-b2f4-9c95447732d9-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-v8w4j\" (UID: \"9d8bf364-3d3e-4125-b2f4-9c95447732d9\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j"
Apr 21 10:49:02.298568 ip-10-0-133-157
kubenswrapper[2577]: I0421 10:49:02.298517 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" Apr 21 10:49:02.422248 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:02.422082 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j"] Apr 21 10:49:02.424938 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:49:02.424909 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d8bf364_3d3e_4125_b2f4_9c95447732d9.slice/crio-e73c32476369bc20b603fec0d552d490d7b466cb911289261376119f7d2f6507 WatchSource:0}: Error finding container e73c32476369bc20b603fec0d552d490d7b466cb911289261376119f7d2f6507: Status 404 returned error can't find the container with id e73c32476369bc20b603fec0d552d490d7b466cb911289261376119f7d2f6507 Apr 21 10:49:02.426856 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:02.426839 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:49:02.467097 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:02.467065 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" event={"ID":"9d8bf364-3d3e-4125-b2f4-9c95447732d9","Type":"ContainerStarted","Data":"e73c32476369bc20b603fec0d552d490d7b466cb911289261376119f7d2f6507"} Apr 21 10:49:03.471231 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:03.471190 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" event={"ID":"9d8bf364-3d3e-4125-b2f4-9c95447732d9","Type":"ContainerStarted","Data":"704d0c68f535bf7936016d9701fcb335dbc6a07943e7bc7eee18f726cdeef10a"} Apr 21 10:49:06.481226 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:06.481189 2577 generic.go:358] "Generic (PLEG): container 
finished" podID="9d8bf364-3d3e-4125-b2f4-9c95447732d9" containerID="704d0c68f535bf7936016d9701fcb335dbc6a07943e7bc7eee18f726cdeef10a" exitCode=0 Apr 21 10:49:06.481638 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:06.481265 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" event={"ID":"9d8bf364-3d3e-4125-b2f4-9c95447732d9","Type":"ContainerDied","Data":"704d0c68f535bf7936016d9701fcb335dbc6a07943e7bc7eee18f726cdeef10a"} Apr 21 10:49:06.659355 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:06.659326 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" Apr 21 10:49:06.678846 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:06.678815 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95-kserve-provision-location\") pod \"43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95\" (UID: \"43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95\") " Apr 21 10:49:06.679151 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:06.679129 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" (UID: "43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:49:06.779568 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:06.779527 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:49:07.486027 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:07.485989 2577 generic.go:358] "Generic (PLEG): container finished" podID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerID="59e126de591e7cc3fb43d47876abe07663705550a03464f68cab8639ccae5551" exitCode=0 Apr 21 10:49:07.486513 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:07.486068 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" Apr 21 10:49:07.486513 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:07.486074 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" event={"ID":"43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95","Type":"ContainerDied","Data":"59e126de591e7cc3fb43d47876abe07663705550a03464f68cab8639ccae5551"} Apr 21 10:49:07.486513 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:07.486118 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk" event={"ID":"43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95","Type":"ContainerDied","Data":"be81a23d6041824a5f95eece7858dde372ff403aebdcde219f93851e33273dc5"} Apr 21 10:49:07.486513 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:07.486134 2577 scope.go:117] "RemoveContainer" containerID="59e126de591e7cc3fb43d47876abe07663705550a03464f68cab8639ccae5551" Apr 21 10:49:07.487880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:07.487853 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" event={"ID":"9d8bf364-3d3e-4125-b2f4-9c95447732d9","Type":"ContainerStarted","Data":"a0a70f83a678b997490492b70548c70d5eaccbe719e45331eb0aa3a81f418203"} Apr 21 10:49:07.488109 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:07.488092 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" Apr 21 10:49:07.494680 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:07.494656 2577 scope.go:117] "RemoveContainer" containerID="863857f90ca2ef5707144252325e1f36e010586e8203277226f27f56b28d0365" Apr 21 10:49:07.502794 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:07.502770 2577 scope.go:117] "RemoveContainer" containerID="59e126de591e7cc3fb43d47876abe07663705550a03464f68cab8639ccae5551" Apr 21 10:49:07.503101 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:49:07.503078 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59e126de591e7cc3fb43d47876abe07663705550a03464f68cab8639ccae5551\": container with ID starting with 59e126de591e7cc3fb43d47876abe07663705550a03464f68cab8639ccae5551 not found: ID does not exist" containerID="59e126de591e7cc3fb43d47876abe07663705550a03464f68cab8639ccae5551" Apr 21 10:49:07.503155 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:07.503113 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59e126de591e7cc3fb43d47876abe07663705550a03464f68cab8639ccae5551"} err="failed to get container status \"59e126de591e7cc3fb43d47876abe07663705550a03464f68cab8639ccae5551\": rpc error: code = NotFound desc = could not find container \"59e126de591e7cc3fb43d47876abe07663705550a03464f68cab8639ccae5551\": container with ID starting with 59e126de591e7cc3fb43d47876abe07663705550a03464f68cab8639ccae5551 not found: ID does not exist" Apr 21 10:49:07.503155 ip-10-0-133-157 
kubenswrapper[2577]: I0421 10:49:07.503134 2577 scope.go:117] "RemoveContainer" containerID="863857f90ca2ef5707144252325e1f36e010586e8203277226f27f56b28d0365" Apr 21 10:49:07.503419 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:49:07.503404 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"863857f90ca2ef5707144252325e1f36e010586e8203277226f27f56b28d0365\": container with ID starting with 863857f90ca2ef5707144252325e1f36e010586e8203277226f27f56b28d0365 not found: ID does not exist" containerID="863857f90ca2ef5707144252325e1f36e010586e8203277226f27f56b28d0365" Apr 21 10:49:07.503486 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:07.503428 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863857f90ca2ef5707144252325e1f36e010586e8203277226f27f56b28d0365"} err="failed to get container status \"863857f90ca2ef5707144252325e1f36e010586e8203277226f27f56b28d0365\": rpc error: code = NotFound desc = could not find container \"863857f90ca2ef5707144252325e1f36e010586e8203277226f27f56b28d0365\": container with ID starting with 863857f90ca2ef5707144252325e1f36e010586e8203277226f27f56b28d0365 not found: ID does not exist" Apr 21 10:49:07.506706 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:07.506662 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" podStartSLOduration=6.506647513 podStartE2EDuration="6.506647513s" podCreationTimestamp="2026-04-21 10:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:49:07.505147193 +0000 UTC m=+2715.843681220" watchObservedRunningTime="2026-04-21 10:49:07.506647513 +0000 UTC m=+2715.845181530" Apr 21 10:49:07.519047 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:07.519011 2577 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk"] Apr 21 10:49:07.521929 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:07.521900 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7f8c8ff5db-rsqxk"] Apr 21 10:49:08.232394 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:08.232358 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" path="/var/lib/kubelet/pods/43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95/volumes" Apr 21 10:49:38.497551 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:38.497505 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" podUID="9d8bf364-3d3e-4125-b2f4-9c95447732d9" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 21 10:49:48.494842 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:48.494795 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" podUID="9d8bf364-3d3e-4125-b2f4-9c95447732d9" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 21 10:49:58.494485 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:49:58.494449 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" Apr 21 10:50:02.097554 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.097520 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j"] Apr 21 10:50:02.098076 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.097822 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" podUID="9d8bf364-3d3e-4125-b2f4-9c95447732d9" 
containerName="kserve-container" containerID="cri-o://a0a70f83a678b997490492b70548c70d5eaccbe719e45331eb0aa3a81f418203" gracePeriod=30 Apr 21 10:50:02.171323 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.171285 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp"] Apr 21 10:50:02.171598 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.171586 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerName="kserve-container" Apr 21 10:50:02.171598 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.171599 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerName="kserve-container" Apr 21 10:50:02.171685 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.171615 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerName="storage-initializer" Apr 21 10:50:02.171685 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.171621 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerName="storage-initializer" Apr 21 10:50:02.171685 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.171670 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="43f5b4d3-c4e2-4ada-8f48-9ccee74f2b95" containerName="kserve-container" Apr 21 10:50:02.174603 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.174585 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" Apr 21 10:50:02.182243 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.182211 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp"] Apr 21 10:50:02.325645 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.325597 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp\" (UID: \"4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" Apr 21 10:50:02.426158 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.426069 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp\" (UID: \"4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" Apr 21 10:50:02.426420 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.426403 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp\" (UID: \"4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" Apr 21 10:50:02.485667 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.485638 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" Apr 21 10:50:02.607560 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.607515 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp"] Apr 21 10:50:02.651411 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:02.651378 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" event={"ID":"4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84","Type":"ContainerStarted","Data":"5249d958c280ca2cd9ba09e6733b0fa0ace9d83d69110fc5c16a6d9753fb46e2"} Apr 21 10:50:03.655879 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:03.655841 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" event={"ID":"4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84","Type":"ContainerStarted","Data":"b5fa905f335474d7c4fef7fe73c4830c2bc77bb957957d24d943b89d4039417e"} Apr 21 10:50:08.493296 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:08.493244 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" podUID="9d8bf364-3d3e-4125-b2f4-9c95447732d9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.51:8080/v2/models/sklearn-v2-mlserver/ready\": dial tcp 10.134.0.51:8080: connect: connection refused" Apr 21 10:50:08.673314 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:08.673272 2577 generic.go:358] "Generic (PLEG): container finished" podID="4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84" containerID="b5fa905f335474d7c4fef7fe73c4830c2bc77bb957957d24d943b89d4039417e" exitCode=0 Apr 21 10:50:08.673495 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:08.673346 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" 
event={"ID":"4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84","Type":"ContainerDied","Data":"b5fa905f335474d7c4fef7fe73c4830c2bc77bb957957d24d943b89d4039417e"} Apr 21 10:50:09.678635 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:09.678600 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" event={"ID":"4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84","Type":"ContainerStarted","Data":"cdc127bec455fa1c3dbc8953e89cd542599b5f910cae354e02a7136be08b69c2"} Apr 21 10:50:09.679063 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:09.678866 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" Apr 21 10:50:09.680285 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:09.680259 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" podUID="4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 21 10:50:09.693901 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:09.693818 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" podStartSLOduration=7.693799883 podStartE2EDuration="7.693799883s" podCreationTimestamp="2026-04-21 10:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:50:09.693493392 +0000 UTC m=+2778.032027410" watchObservedRunningTime="2026-04-21 10:50:09.693799883 +0000 UTC m=+2778.032333901" Apr 21 10:50:10.140119 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.140094 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" Apr 21 10:50:10.181586 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.181553 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d8bf364-3d3e-4125-b2f4-9c95447732d9-kserve-provision-location\") pod \"9d8bf364-3d3e-4125-b2f4-9c95447732d9\" (UID: \"9d8bf364-3d3e-4125-b2f4-9c95447732d9\") " Apr 21 10:50:10.181928 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.181905 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d8bf364-3d3e-4125-b2f4-9c95447732d9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9d8bf364-3d3e-4125-b2f4-9c95447732d9" (UID: "9d8bf364-3d3e-4125-b2f4-9c95447732d9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:50:10.282892 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.282809 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d8bf364-3d3e-4125-b2f4-9c95447732d9-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:50:10.682943 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.682846 2577 generic.go:358] "Generic (PLEG): container finished" podID="9d8bf364-3d3e-4125-b2f4-9c95447732d9" containerID="a0a70f83a678b997490492b70548c70d5eaccbe719e45331eb0aa3a81f418203" exitCode=0 Apr 21 10:50:10.682943 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.682920 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" Apr 21 10:50:10.682943 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.682924 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" event={"ID":"9d8bf364-3d3e-4125-b2f4-9c95447732d9","Type":"ContainerDied","Data":"a0a70f83a678b997490492b70548c70d5eaccbe719e45331eb0aa3a81f418203"} Apr 21 10:50:10.683460 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.682981 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j" event={"ID":"9d8bf364-3d3e-4125-b2f4-9c95447732d9","Type":"ContainerDied","Data":"e73c32476369bc20b603fec0d552d490d7b466cb911289261376119f7d2f6507"} Apr 21 10:50:10.683460 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.683007 2577 scope.go:117] "RemoveContainer" containerID="a0a70f83a678b997490492b70548c70d5eaccbe719e45331eb0aa3a81f418203" Apr 21 10:50:10.683706 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.683677 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" podUID="4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 21 10:50:10.691438 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.691407 2577 scope.go:117] "RemoveContainer" containerID="704d0c68f535bf7936016d9701fcb335dbc6a07943e7bc7eee18f726cdeef10a" Apr 21 10:50:10.701476 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.701443 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j"] Apr 21 10:50:10.702258 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.702237 2577 scope.go:117] "RemoveContainer" containerID="a0a70f83a678b997490492b70548c70d5eaccbe719e45331eb0aa3a81f418203" Apr 21 
10:50:10.702616 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:50:10.702595 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0a70f83a678b997490492b70548c70d5eaccbe719e45331eb0aa3a81f418203\": container with ID starting with a0a70f83a678b997490492b70548c70d5eaccbe719e45331eb0aa3a81f418203 not found: ID does not exist" containerID="a0a70f83a678b997490492b70548c70d5eaccbe719e45331eb0aa3a81f418203" Apr 21 10:50:10.702737 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.702625 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a70f83a678b997490492b70548c70d5eaccbe719e45331eb0aa3a81f418203"} err="failed to get container status \"a0a70f83a678b997490492b70548c70d5eaccbe719e45331eb0aa3a81f418203\": rpc error: code = NotFound desc = could not find container \"a0a70f83a678b997490492b70548c70d5eaccbe719e45331eb0aa3a81f418203\": container with ID starting with a0a70f83a678b997490492b70548c70d5eaccbe719e45331eb0aa3a81f418203 not found: ID does not exist" Apr 21 10:50:10.702737 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.702677 2577 scope.go:117] "RemoveContainer" containerID="704d0c68f535bf7936016d9701fcb335dbc6a07943e7bc7eee18f726cdeef10a" Apr 21 10:50:10.703085 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:50:10.703056 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"704d0c68f535bf7936016d9701fcb335dbc6a07943e7bc7eee18f726cdeef10a\": container with ID starting with 704d0c68f535bf7936016d9701fcb335dbc6a07943e7bc7eee18f726cdeef10a not found: ID does not exist" containerID="704d0c68f535bf7936016d9701fcb335dbc6a07943e7bc7eee18f726cdeef10a" Apr 21 10:50:10.703211 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.703090 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"704d0c68f535bf7936016d9701fcb335dbc6a07943e7bc7eee18f726cdeef10a"} err="failed to get container status \"704d0c68f535bf7936016d9701fcb335dbc6a07943e7bc7eee18f726cdeef10a\": rpc error: code = NotFound desc = could not find container \"704d0c68f535bf7936016d9701fcb335dbc6a07943e7bc7eee18f726cdeef10a\": container with ID starting with 704d0c68f535bf7936016d9701fcb335dbc6a07943e7bc7eee18f726cdeef10a not found: ID does not exist" Apr 21 10:50:10.704489 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:10.704464 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-v8w4j"] Apr 21 10:50:12.232006 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:12.231974 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d8bf364-3d3e-4125-b2f4-9c95447732d9" path="/var/lib/kubelet/pods/9d8bf364-3d3e-4125-b2f4-9c95447732d9/volumes" Apr 21 10:50:20.684292 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:20.684248 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" podUID="4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.52:8080: connect: connection refused" Apr 21 10:50:30.685273 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:30.685244 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" Apr 21 10:50:39.137940 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.137912 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp_4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84/kserve-container/0.log" Apr 21 10:50:39.263491 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.263454 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp"] Apr 21 10:50:39.263796 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.263733 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" podUID="4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84" containerName="kserve-container" containerID="cri-o://cdc127bec455fa1c3dbc8953e89cd542599b5f910cae354e02a7136be08b69c2" gracePeriod=30 Apr 21 10:50:39.322879 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.322844 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb"] Apr 21 10:50:39.323238 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.323220 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d8bf364-3d3e-4125-b2f4-9c95447732d9" containerName="storage-initializer" Apr 21 10:50:39.323327 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.323240 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d8bf364-3d3e-4125-b2f4-9c95447732d9" containerName="storage-initializer" Apr 21 10:50:39.323327 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.323271 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d8bf364-3d3e-4125-b2f4-9c95447732d9" containerName="kserve-container" Apr 21 10:50:39.323327 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.323281 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d8bf364-3d3e-4125-b2f4-9c95447732d9" containerName="kserve-container" Apr 21 10:50:39.323491 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.323354 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d8bf364-3d3e-4125-b2f4-9c95447732d9" containerName="kserve-container" Apr 21 10:50:39.326400 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.326378 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" Apr 21 10:50:39.336522 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.336483 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb"] Apr 21 10:50:39.411406 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.411300 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72044f8d-4bd3-4ffc-ba63-073778bf0c49-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb\" (UID: \"72044f8d-4bd3-4ffc-ba63-073778bf0c49\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" Apr 21 10:50:39.511816 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.511774 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72044f8d-4bd3-4ffc-ba63-073778bf0c49-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb\" (UID: \"72044f8d-4bd3-4ffc-ba63-073778bf0c49\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" Apr 21 10:50:39.512143 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.512123 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72044f8d-4bd3-4ffc-ba63-073778bf0c49-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb\" (UID: \"72044f8d-4bd3-4ffc-ba63-073778bf0c49\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" Apr 21 10:50:39.638199 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.638158 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" Apr 21 10:50:39.762829 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.762791 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb"] Apr 21 10:50:39.765683 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:50:39.765655 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72044f8d_4bd3_4ffc_ba63_073778bf0c49.slice/crio-dbc032013cce8e14a6e119e2432425f43308adcdc673e0dacf4e91136a994865 WatchSource:0}: Error finding container dbc032013cce8e14a6e119e2432425f43308adcdc673e0dacf4e91136a994865: Status 404 returned error can't find the container with id dbc032013cce8e14a6e119e2432425f43308adcdc673e0dacf4e91136a994865 Apr 21 10:50:39.775991 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:39.775966 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" event={"ID":"72044f8d-4bd3-4ffc-ba63-073778bf0c49","Type":"ContainerStarted","Data":"dbc032013cce8e14a6e119e2432425f43308adcdc673e0dacf4e91136a994865"} Apr 21 10:50:40.610858 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.610833 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" Apr 21 10:50:40.723911 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.723865 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84-kserve-provision-location\") pod \"4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84\" (UID: \"4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84\") " Apr 21 10:50:40.730478 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.730449 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84" (UID: "4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:50:40.780612 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.780576 2577 generic.go:358] "Generic (PLEG): container finished" podID="4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84" containerID="cdc127bec455fa1c3dbc8953e89cd542599b5f910cae354e02a7136be08b69c2" exitCode=0 Apr 21 10:50:40.780799 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.780653 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" event={"ID":"4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84","Type":"ContainerDied","Data":"cdc127bec455fa1c3dbc8953e89cd542599b5f910cae354e02a7136be08b69c2"} Apr 21 10:50:40.780799 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.780671 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" Apr 21 10:50:40.780799 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.780693 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp" event={"ID":"4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84","Type":"ContainerDied","Data":"5249d958c280ca2cd9ba09e6733b0fa0ace9d83d69110fc5c16a6d9753fb46e2"} Apr 21 10:50:40.780799 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.780712 2577 scope.go:117] "RemoveContainer" containerID="cdc127bec455fa1c3dbc8953e89cd542599b5f910cae354e02a7136be08b69c2" Apr 21 10:50:40.782047 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.782021 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" event={"ID":"72044f8d-4bd3-4ffc-ba63-073778bf0c49","Type":"ContainerStarted","Data":"0bb7eef851b785df51053d0e279f47d5efad8d275b3cb4c5d3e6f4aefcdaacdc"} Apr 21 10:50:40.790044 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.790024 2577 scope.go:117] "RemoveContainer" containerID="b5fa905f335474d7c4fef7fe73c4830c2bc77bb957957d24d943b89d4039417e" Apr 21 10:50:40.797239 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.797216 2577 scope.go:117] "RemoveContainer" containerID="cdc127bec455fa1c3dbc8953e89cd542599b5f910cae354e02a7136be08b69c2" Apr 21 10:50:40.797511 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:50:40.797494 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdc127bec455fa1c3dbc8953e89cd542599b5f910cae354e02a7136be08b69c2\": container with ID starting with cdc127bec455fa1c3dbc8953e89cd542599b5f910cae354e02a7136be08b69c2 not found: ID does not exist" containerID="cdc127bec455fa1c3dbc8953e89cd542599b5f910cae354e02a7136be08b69c2" Apr 21 10:50:40.797565 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.797522 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdc127bec455fa1c3dbc8953e89cd542599b5f910cae354e02a7136be08b69c2"} err="failed to get container status \"cdc127bec455fa1c3dbc8953e89cd542599b5f910cae354e02a7136be08b69c2\": rpc error: code = NotFound desc = could not find container \"cdc127bec455fa1c3dbc8953e89cd542599b5f910cae354e02a7136be08b69c2\": container with ID starting with cdc127bec455fa1c3dbc8953e89cd542599b5f910cae354e02a7136be08b69c2 not found: ID does not exist" Apr 21 10:50:40.797565 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.797540 2577 scope.go:117] "RemoveContainer" containerID="b5fa905f335474d7c4fef7fe73c4830c2bc77bb957957d24d943b89d4039417e" Apr 21 10:50:40.797801 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:50:40.797782 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5fa905f335474d7c4fef7fe73c4830c2bc77bb957957d24d943b89d4039417e\": container with ID starting with b5fa905f335474d7c4fef7fe73c4830c2bc77bb957957d24d943b89d4039417e not found: ID does not exist" containerID="b5fa905f335474d7c4fef7fe73c4830c2bc77bb957957d24d943b89d4039417e" Apr 21 10:50:40.797854 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.797809 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5fa905f335474d7c4fef7fe73c4830c2bc77bb957957d24d943b89d4039417e"} err="failed to get container status \"b5fa905f335474d7c4fef7fe73c4830c2bc77bb957957d24d943b89d4039417e\": rpc error: code = NotFound desc = could not find container \"b5fa905f335474d7c4fef7fe73c4830c2bc77bb957957d24d943b89d4039417e\": container with ID starting with b5fa905f335474d7c4fef7fe73c4830c2bc77bb957957d24d943b89d4039417e not found: ID does not exist" Apr 21 10:50:40.814034 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.813999 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp"] Apr 21 10:50:40.817141 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.817113 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-c7f7d7dd-rglkp"] Apr 21 10:50:40.825057 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:40.825033 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:50:42.232397 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:42.232361 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84" path="/var/lib/kubelet/pods/4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84/volumes" Apr 21 10:50:43.793104 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:43.793069 2577 generic.go:358] "Generic (PLEG): container finished" podID="72044f8d-4bd3-4ffc-ba63-073778bf0c49" containerID="0bb7eef851b785df51053d0e279f47d5efad8d275b3cb4c5d3e6f4aefcdaacdc" exitCode=0 Apr 21 10:50:43.793482 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:43.793147 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" event={"ID":"72044f8d-4bd3-4ffc-ba63-073778bf0c49","Type":"ContainerDied","Data":"0bb7eef851b785df51053d0e279f47d5efad8d275b3cb4c5d3e6f4aefcdaacdc"} Apr 21 10:50:44.798329 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:44.798293 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" event={"ID":"72044f8d-4bd3-4ffc-ba63-073778bf0c49","Type":"ContainerStarted","Data":"66811061f79aed32f0d9fc9ee5f917860c925cd7f8cfe36e14f6a57e4d768782"} Apr 21 10:50:44.798720 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:44.798519 2577 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" Apr 21 10:50:44.815178 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:50:44.815131 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" podStartSLOduration=5.8151165769999995 podStartE2EDuration="5.815116577s" podCreationTimestamp="2026-04-21 10:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:50:44.814474925 +0000 UTC m=+2813.153008942" watchObservedRunningTime="2026-04-21 10:50:44.815116577 +0000 UTC m=+2813.153650593" Apr 21 10:51:15.895254 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:15.895202 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" podUID="72044f8d-4bd3-4ffc-ba63-073778bf0c49" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 21 10:51:25.804494 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:25.804452 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" Apr 21 10:51:29.437716 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:29.437679 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb"] Apr 21 10:51:29.438144 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:29.437943 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" podUID="72044f8d-4bd3-4ffc-ba63-073778bf0c49" containerName="kserve-container" containerID="cri-o://66811061f79aed32f0d9fc9ee5f917860c925cd7f8cfe36e14f6a57e4d768782" gracePeriod=30 Apr 21 
10:51:29.491035 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:29.491000 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz"] Apr 21 10:51:29.491308 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:29.491297 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84" containerName="storage-initializer" Apr 21 10:51:29.491353 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:29.491310 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84" containerName="storage-initializer" Apr 21 10:51:29.491387 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:29.491354 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84" containerName="kserve-container" Apr 21 10:51:29.491387 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:29.491360 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84" containerName="kserve-container" Apr 21 10:51:29.491454 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:29.491417 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4aa8f8fe-ca64-4792-a3b9-8389bd0e9e84" containerName="kserve-container" Apr 21 10:51:29.494232 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:29.494209 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" Apr 21 10:51:29.502686 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:29.502658 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz"] Apr 21 10:51:29.519587 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:29.519550 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b929e01-6797-45d2-ad82-56dc1f0a8a61-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz\" (UID: \"9b929e01-6797-45d2-ad82-56dc1f0a8a61\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" Apr 21 10:51:29.620482 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:29.620445 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b929e01-6797-45d2-ad82-56dc1f0a8a61-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz\" (UID: \"9b929e01-6797-45d2-ad82-56dc1f0a8a61\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" Apr 21 10:51:29.620814 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:29.620796 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b929e01-6797-45d2-ad82-56dc1f0a8a61-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz\" (UID: \"9b929e01-6797-45d2-ad82-56dc1f0a8a61\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" Apr 21 10:51:29.805858 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:29.805822 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" Apr 21 10:51:29.928672 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:29.928630 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz"] Apr 21 10:51:30.938417 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:30.938380 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" event={"ID":"9b929e01-6797-45d2-ad82-56dc1f0a8a61","Type":"ContainerStarted","Data":"49b68710730b8b0e620d2bd318a79666e45c8152e5b5d41a76e4112914142ccd"} Apr 21 10:51:30.938417 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:30.938417 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" event={"ID":"9b929e01-6797-45d2-ad82-56dc1f0a8a61","Type":"ContainerStarted","Data":"8807293ab76f7c1e742d951c73ab306b4eb560103efb2ec41e1f972c292f2c4f"} Apr 21 10:51:33.947737 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:33.947645 2577 generic.go:358] "Generic (PLEG): container finished" podID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerID="49b68710730b8b0e620d2bd318a79666e45c8152e5b5d41a76e4112914142ccd" exitCode=0 Apr 21 10:51:33.947737 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:33.947716 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" event={"ID":"9b929e01-6797-45d2-ad82-56dc1f0a8a61","Type":"ContainerDied","Data":"49b68710730b8b0e620d2bd318a79666e45c8152e5b5d41a76e4112914142ccd"} Apr 21 10:51:34.952667 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:34.952630 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" event={"ID":"9b929e01-6797-45d2-ad82-56dc1f0a8a61","Type":"ContainerStarted","Data":"c56dfe00385d2580949cd35b36d6189e678974fc9dc318744822279250122876"} 
Apr 21 10:51:34.953093 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:34.952924 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" Apr 21 10:51:34.954338 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:34.954309 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" podUID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 21 10:51:34.970614 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:34.970567 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" podStartSLOduration=5.970552874 podStartE2EDuration="5.970552874s" podCreationTimestamp="2026-04-21 10:51:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:51:34.96923187 +0000 UTC m=+2863.307765887" watchObservedRunningTime="2026-04-21 10:51:34.970552874 +0000 UTC m=+2863.309086891" Apr 21 10:51:35.802854 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:35.802808 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" podUID="72044f8d-4bd3-4ffc-ba63-073778bf0c49" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.53:8080/v2/models/isvc-sklearn-v2-runtime/ready\": dial tcp 10.134.0.53:8080: connect: connection refused" Apr 21 10:51:35.956286 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:35.956248 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" podUID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.54:8080: connect: connection refused" Apr 21 10:51:37.506713 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.506687 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" Apr 21 10:51:37.582209 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.582113 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72044f8d-4bd3-4ffc-ba63-073778bf0c49-kserve-provision-location\") pod \"72044f8d-4bd3-4ffc-ba63-073778bf0c49\" (UID: \"72044f8d-4bd3-4ffc-ba63-073778bf0c49\") " Apr 21 10:51:37.582450 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.582423 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72044f8d-4bd3-4ffc-ba63-073778bf0c49-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "72044f8d-4bd3-4ffc-ba63-073778bf0c49" (UID: "72044f8d-4bd3-4ffc-ba63-073778bf0c49"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:51:37.682609 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.682577 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72044f8d-4bd3-4ffc-ba63-073778bf0c49-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:51:37.964695 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.964657 2577 generic.go:358] "Generic (PLEG): container finished" podID="72044f8d-4bd3-4ffc-ba63-073778bf0c49" containerID="66811061f79aed32f0d9fc9ee5f917860c925cd7f8cfe36e14f6a57e4d768782" exitCode=0 Apr 21 10:51:37.964897 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.964722 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" Apr 21 10:51:37.964897 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.964722 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" event={"ID":"72044f8d-4bd3-4ffc-ba63-073778bf0c49","Type":"ContainerDied","Data":"66811061f79aed32f0d9fc9ee5f917860c925cd7f8cfe36e14f6a57e4d768782"} Apr 21 10:51:37.964897 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.964774 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb" event={"ID":"72044f8d-4bd3-4ffc-ba63-073778bf0c49","Type":"ContainerDied","Data":"dbc032013cce8e14a6e119e2432425f43308adcdc673e0dacf4e91136a994865"} Apr 21 10:51:37.964897 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.964793 2577 scope.go:117] "RemoveContainer" containerID="66811061f79aed32f0d9fc9ee5f917860c925cd7f8cfe36e14f6a57e4d768782" Apr 21 10:51:37.973334 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.973313 2577 scope.go:117] "RemoveContainer" containerID="0bb7eef851b785df51053d0e279f47d5efad8d275b3cb4c5d3e6f4aefcdaacdc" Apr 21 10:51:37.981788 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.981760 2577 scope.go:117] "RemoveContainer" containerID="66811061f79aed32f0d9fc9ee5f917860c925cd7f8cfe36e14f6a57e4d768782" Apr 21 10:51:37.982131 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:51:37.982109 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66811061f79aed32f0d9fc9ee5f917860c925cd7f8cfe36e14f6a57e4d768782\": container with ID starting with 66811061f79aed32f0d9fc9ee5f917860c925cd7f8cfe36e14f6a57e4d768782 not found: ID does not exist" containerID="66811061f79aed32f0d9fc9ee5f917860c925cd7f8cfe36e14f6a57e4d768782" Apr 21 10:51:37.982177 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.982144 2577 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66811061f79aed32f0d9fc9ee5f917860c925cd7f8cfe36e14f6a57e4d768782"} err="failed to get container status \"66811061f79aed32f0d9fc9ee5f917860c925cd7f8cfe36e14f6a57e4d768782\": rpc error: code = NotFound desc = could not find container \"66811061f79aed32f0d9fc9ee5f917860c925cd7f8cfe36e14f6a57e4d768782\": container with ID starting with 66811061f79aed32f0d9fc9ee5f917860c925cd7f8cfe36e14f6a57e4d768782 not found: ID does not exist" Apr 21 10:51:37.982177 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.982165 2577 scope.go:117] "RemoveContainer" containerID="0bb7eef851b785df51053d0e279f47d5efad8d275b3cb4c5d3e6f4aefcdaacdc" Apr 21 10:51:37.982421 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:51:37.982402 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bb7eef851b785df51053d0e279f47d5efad8d275b3cb4c5d3e6f4aefcdaacdc\": container with ID starting with 0bb7eef851b785df51053d0e279f47d5efad8d275b3cb4c5d3e6f4aefcdaacdc not found: ID does not exist" containerID="0bb7eef851b785df51053d0e279f47d5efad8d275b3cb4c5d3e6f4aefcdaacdc" Apr 21 10:51:37.982456 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.982429 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bb7eef851b785df51053d0e279f47d5efad8d275b3cb4c5d3e6f4aefcdaacdc"} err="failed to get container status \"0bb7eef851b785df51053d0e279f47d5efad8d275b3cb4c5d3e6f4aefcdaacdc\": rpc error: code = NotFound desc = could not find container \"0bb7eef851b785df51053d0e279f47d5efad8d275b3cb4c5d3e6f4aefcdaacdc\": container with ID starting with 0bb7eef851b785df51053d0e279f47d5efad8d275b3cb4c5d3e6f4aefcdaacdc not found: ID does not exist" Apr 21 10:51:37.985096 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.985073 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb"] Apr 21 10:51:37.988465 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:37.988442 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-m9vdb"] Apr 21 10:51:38.232250 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:38.232168 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72044f8d-4bd3-4ffc-ba63-073778bf0c49" path="/var/lib/kubelet/pods/72044f8d-4bd3-4ffc-ba63-073778bf0c49/volumes" Apr 21 10:51:45.957316 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:45.957220 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" podUID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 21 10:51:55.956886 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:51:55.956841 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" podUID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 21 10:52:05.956652 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:05.956607 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" podUID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 21 10:52:15.957016 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:15.956975 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" podUID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.134.0.54:8080: connect: connection refused" Apr 21 10:52:25.957229 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:25.957182 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" podUID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 21 10:52:35.957263 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:35.957211 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" podUID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.54:8080: connect: connection refused" Apr 21 10:52:45.957452 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:45.957421 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" Apr 21 10:52:49.668577 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:49.668539 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz"] Apr 21 10:52:49.668993 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:49.668828 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" podUID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerName="kserve-container" containerID="cri-o://c56dfe00385d2580949cd35b36d6189e678974fc9dc318744822279250122876" gracePeriod=30 Apr 21 10:52:49.716245 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:49.716208 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj"] Apr 21 10:52:49.716517 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:49.716504 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="72044f8d-4bd3-4ffc-ba63-073778bf0c49" containerName="kserve-container"
Apr 21 10:52:49.716517 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:49.716518 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="72044f8d-4bd3-4ffc-ba63-073778bf0c49" containerName="kserve-container"
Apr 21 10:52:49.716650 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:49.716526 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72044f8d-4bd3-4ffc-ba63-073778bf0c49" containerName="storage-initializer"
Apr 21 10:52:49.716650 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:49.716531 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="72044f8d-4bd3-4ffc-ba63-073778bf0c49" containerName="storage-initializer"
Apr 21 10:52:49.716650 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:49.716588 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="72044f8d-4bd3-4ffc-ba63-073778bf0c49" containerName="kserve-container"
Apr 21 10:52:49.719578 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:49.719561 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj"
Apr 21 10:52:49.726889 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:49.726865 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj"]
Apr 21 10:52:49.739863 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:49.739832 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4cca3709-1d82-405c-9e21-55c6546a18ac-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj\" (UID: \"4cca3709-1d82-405c-9e21-55c6546a18ac\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj"
Apr 21 10:52:49.840272 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:49.840230 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4cca3709-1d82-405c-9e21-55c6546a18ac-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj\" (UID: \"4cca3709-1d82-405c-9e21-55c6546a18ac\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj"
Apr 21 10:52:49.840607 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:49.840585 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4cca3709-1d82-405c-9e21-55c6546a18ac-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj\" (UID: \"4cca3709-1d82-405c-9e21-55c6546a18ac\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj"
Apr 21 10:52:50.030415 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:50.030373 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj"
Apr 21 10:52:50.156566 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:50.156529 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj"]
Apr 21 10:52:50.159690 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:52:50.159654 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cca3709_1d82_405c_9e21_55c6546a18ac.slice/crio-202d01acb8a6e22fa55c09f0c53a3196ad14497eba377313ee47d7e77f5797f7 WatchSource:0}: Error finding container 202d01acb8a6e22fa55c09f0c53a3196ad14497eba377313ee47d7e77f5797f7: Status 404 returned error can't find the container with id 202d01acb8a6e22fa55c09f0c53a3196ad14497eba377313ee47d7e77f5797f7
Apr 21 10:52:50.177647 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:50.177618 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" event={"ID":"4cca3709-1d82-405c-9e21-55c6546a18ac","Type":"ContainerStarted","Data":"202d01acb8a6e22fa55c09f0c53a3196ad14497eba377313ee47d7e77f5797f7"}
Apr 21 10:52:51.182779 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:51.182724 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" event={"ID":"4cca3709-1d82-405c-9e21-55c6546a18ac","Type":"ContainerStarted","Data":"c1e1c3650a394bddd1a3dc4d06b8d68f91b4e3ccbd7bb8b242a22b5714fe5f17"}
Apr 21 10:52:54.193434 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:54.193395 2577 generic.go:358] "Generic (PLEG): container finished" podID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerID="c56dfe00385d2580949cd35b36d6189e678974fc9dc318744822279250122876" exitCode=0
Apr 21 10:52:54.193899 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:54.193459 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" event={"ID":"9b929e01-6797-45d2-ad82-56dc1f0a8a61","Type":"ContainerDied","Data":"c56dfe00385d2580949cd35b36d6189e678974fc9dc318744822279250122876"}
Apr 21 10:52:54.195076 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:54.195052 2577 generic.go:358] "Generic (PLEG): container finished" podID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerID="c1e1c3650a394bddd1a3dc4d06b8d68f91b4e3ccbd7bb8b242a22b5714fe5f17" exitCode=0
Apr 21 10:52:54.195214 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:54.195123 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" event={"ID":"4cca3709-1d82-405c-9e21-55c6546a18ac","Type":"ContainerDied","Data":"c1e1c3650a394bddd1a3dc4d06b8d68f91b4e3ccbd7bb8b242a22b5714fe5f17"}
Apr 21 10:52:54.314387 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:54.314365 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz"
Apr 21 10:52:54.378396 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:54.378359 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b929e01-6797-45d2-ad82-56dc1f0a8a61-kserve-provision-location\") pod \"9b929e01-6797-45d2-ad82-56dc1f0a8a61\" (UID: \"9b929e01-6797-45d2-ad82-56dc1f0a8a61\") "
Apr 21 10:52:54.378709 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:54.378685 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b929e01-6797-45d2-ad82-56dc1f0a8a61-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9b929e01-6797-45d2-ad82-56dc1f0a8a61" (UID: "9b929e01-6797-45d2-ad82-56dc1f0a8a61"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:52:54.479148 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:54.479099 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b929e01-6797-45d2-ad82-56dc1f0a8a61-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:52:55.198983 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:55.198955 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz"
Apr 21 10:52:55.198983 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:55.198973 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz" event={"ID":"9b929e01-6797-45d2-ad82-56dc1f0a8a61","Type":"ContainerDied","Data":"8807293ab76f7c1e742d951c73ab306b4eb560103efb2ec41e1f972c292f2c4f"}
Apr 21 10:52:55.199526 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:55.199019 2577 scope.go:117] "RemoveContainer" containerID="c56dfe00385d2580949cd35b36d6189e678974fc9dc318744822279250122876"
Apr 21 10:52:55.200838 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:55.200808 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" event={"ID":"4cca3709-1d82-405c-9e21-55c6546a18ac","Type":"ContainerStarted","Data":"5884ec2c381deba4626eb77503be93a04441d417be313b0ea2c7b2541d239570"}
Apr 21 10:52:55.201111 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:55.201085 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj"
Apr 21 10:52:55.202915 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:55.202889 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 21 10:52:55.207732 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:55.207714 2577 scope.go:117] "RemoveContainer" containerID="49b68710730b8b0e620d2bd318a79666e45c8152e5b5d41a76e4112914142ccd"
Apr 21 10:52:55.217630 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:55.217585 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" podStartSLOduration=6.217571923 podStartE2EDuration="6.217571923s" podCreationTimestamp="2026-04-21 10:52:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:52:55.216360027 +0000 UTC m=+2943.554894041" watchObservedRunningTime="2026-04-21 10:52:55.217571923 +0000 UTC m=+2943.556105940"
Apr 21 10:52:55.230245 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:55.230212 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz"]
Apr 21 10:52:55.236371 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:55.236345 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-56dc7f9fb6-9qccz"]
Apr 21 10:52:56.205233 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:56.205194 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 21 10:52:56.232355 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:52:56.232313 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" path="/var/lib/kubelet/pods/9b929e01-6797-45d2-ad82-56dc1f0a8a61/volumes"
Apr 21 10:53:06.206274 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:53:06.206231 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 21 10:53:16.205696 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:53:16.205592 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 21 10:53:26.206249 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:53:26.206206 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 21 10:53:36.205471 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:53:36.205426 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 21 10:53:46.205966 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:53:46.205912 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 21 10:53:52.331580 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:53:52.331546 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log"
Apr 21 10:53:52.334465 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:53:52.334439 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log"
Apr 21 10:53:56.205686 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:53:56.205639 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 21 10:54:01.228942 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:01.228910 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj"
Apr 21 10:54:09.814690 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:09.814652 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj"]
Apr 21 10:54:09.815275 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:09.815246 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerName="kserve-container" containerID="cri-o://5884ec2c381deba4626eb77503be93a04441d417be313b0ea2c7b2541d239570" gracePeriod=30
Apr 21 10:54:09.873718 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:09.873676 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs"]
Apr 21 10:54:09.874028 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:09.874014 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerName="storage-initializer"
Apr 21 10:54:09.874080 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:09.874030 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerName="storage-initializer"
Apr 21 10:54:09.874080 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:09.874040 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerName="kserve-container"
Apr 21 10:54:09.874080 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:09.874046 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerName="kserve-container"
Apr 21 10:54:09.874197 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:09.874103 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b929e01-6797-45d2-ad82-56dc1f0a8a61" containerName="kserve-container"
Apr 21 10:54:09.877050 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:09.877031 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs"
Apr 21 10:54:09.886919 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:09.886888 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs"]
Apr 21 10:54:09.976164 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:09.976125 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/495b8b26-8708-49ad-b82d-e8a20fe5a26a-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-wpxxs\" (UID: \"495b8b26-8708-49ad-b82d-e8a20fe5a26a\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs"
Apr 21 10:54:10.077479 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:10.077389 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/495b8b26-8708-49ad-b82d-e8a20fe5a26a-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-wpxxs\" (UID: \"495b8b26-8708-49ad-b82d-e8a20fe5a26a\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs"
Apr 21 10:54:10.077826 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:10.077804 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/495b8b26-8708-49ad-b82d-e8a20fe5a26a-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-wpxxs\" (UID: \"495b8b26-8708-49ad-b82d-e8a20fe5a26a\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs"
Apr 21 10:54:10.187677 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:10.187632 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs"
Apr 21 10:54:10.313122 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:10.313097 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs"]
Apr 21 10:54:10.315564 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:54:10.315538 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod495b8b26_8708_49ad_b82d_e8a20fe5a26a.slice/crio-f6a32b0d22ed7aee870ab5182bfa845f7996a748cfdfe0348169d89d84c4ba51 WatchSource:0}: Error finding container f6a32b0d22ed7aee870ab5182bfa845f7996a748cfdfe0348169d89d84c4ba51: Status 404 returned error can't find the container with id f6a32b0d22ed7aee870ab5182bfa845f7996a748cfdfe0348169d89d84c4ba51
Apr 21 10:54:10.317469 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:10.317453 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 10:54:10.421986 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:10.421948 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs" event={"ID":"495b8b26-8708-49ad-b82d-e8a20fe5a26a","Type":"ContainerStarted","Data":"0ddef7bde5a41790b79af84480391a92650edbe4350c82d8d7c9497f01665f57"}
Apr 21 10:54:10.421986 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:10.421988 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs" event={"ID":"495b8b26-8708-49ad-b82d-e8a20fe5a26a","Type":"ContainerStarted","Data":"f6a32b0d22ed7aee870ab5182bfa845f7996a748cfdfe0348169d89d84c4ba51"}
Apr 21 10:54:11.228053 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:11.228014 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.55:8080: connect: connection refused"
Apr 21 10:54:14.168413 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.168387 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj"
Apr 21 10:54:14.211027 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.210985 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4cca3709-1d82-405c-9e21-55c6546a18ac-kserve-provision-location\") pod \"4cca3709-1d82-405c-9e21-55c6546a18ac\" (UID: \"4cca3709-1d82-405c-9e21-55c6546a18ac\") "
Apr 21 10:54:14.211406 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.211375 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cca3709-1d82-405c-9e21-55c6546a18ac-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4cca3709-1d82-405c-9e21-55c6546a18ac" (UID: "4cca3709-1d82-405c-9e21-55c6546a18ac"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 10:54:14.312460 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.312366 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4cca3709-1d82-405c-9e21-55c6546a18ac-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 10:54:14.435690 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.435647 2577 generic.go:358] "Generic (PLEG): container finished" podID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerID="5884ec2c381deba4626eb77503be93a04441d417be313b0ea2c7b2541d239570" exitCode=0
Apr 21 10:54:14.435882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.435723 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" event={"ID":"4cca3709-1d82-405c-9e21-55c6546a18ac","Type":"ContainerDied","Data":"5884ec2c381deba4626eb77503be93a04441d417be313b0ea2c7b2541d239570"}
Apr 21 10:54:14.435882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.435780 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj" event={"ID":"4cca3709-1d82-405c-9e21-55c6546a18ac","Type":"ContainerDied","Data":"202d01acb8a6e22fa55c09f0c53a3196ad14497eba377313ee47d7e77f5797f7"}
Apr 21 10:54:14.435882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.435796 2577 scope.go:117] "RemoveContainer" containerID="5884ec2c381deba4626eb77503be93a04441d417be313b0ea2c7b2541d239570"
Apr 21 10:54:14.435882 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.435735 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj"
Apr 21 10:54:14.444151 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.444126 2577 scope.go:117] "RemoveContainer" containerID="c1e1c3650a394bddd1a3dc4d06b8d68f91b4e3ccbd7bb8b242a22b5714fe5f17"
Apr 21 10:54:14.454348 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.454317 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj"]
Apr 21 10:54:14.454711 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.454692 2577 scope.go:117] "RemoveContainer" containerID="5884ec2c381deba4626eb77503be93a04441d417be313b0ea2c7b2541d239570"
Apr 21 10:54:14.455229 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:54:14.455193 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5884ec2c381deba4626eb77503be93a04441d417be313b0ea2c7b2541d239570\": container with ID starting with 5884ec2c381deba4626eb77503be93a04441d417be313b0ea2c7b2541d239570 not found: ID does not exist" containerID="5884ec2c381deba4626eb77503be93a04441d417be313b0ea2c7b2541d239570"
Apr 21 10:54:14.455347 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.455240 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5884ec2c381deba4626eb77503be93a04441d417be313b0ea2c7b2541d239570"} err="failed to get container status \"5884ec2c381deba4626eb77503be93a04441d417be313b0ea2c7b2541d239570\": rpc error: code = NotFound desc = could not find container \"5884ec2c381deba4626eb77503be93a04441d417be313b0ea2c7b2541d239570\": container with ID starting with 5884ec2c381deba4626eb77503be93a04441d417be313b0ea2c7b2541d239570 not found: ID does not exist"
Apr 21 10:54:14.455347 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.455268 2577 scope.go:117] "RemoveContainer" containerID="c1e1c3650a394bddd1a3dc4d06b8d68f91b4e3ccbd7bb8b242a22b5714fe5f17"
Apr 21 10:54:14.455539 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:54:14.455519 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1e1c3650a394bddd1a3dc4d06b8d68f91b4e3ccbd7bb8b242a22b5714fe5f17\": container with ID starting with c1e1c3650a394bddd1a3dc4d06b8d68f91b4e3ccbd7bb8b242a22b5714fe5f17 not found: ID does not exist" containerID="c1e1c3650a394bddd1a3dc4d06b8d68f91b4e3ccbd7bb8b242a22b5714fe5f17"
Apr 21 10:54:14.455599 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.455545 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1e1c3650a394bddd1a3dc4d06b8d68f91b4e3ccbd7bb8b242a22b5714fe5f17"} err="failed to get container status \"c1e1c3650a394bddd1a3dc4d06b8d68f91b4e3ccbd7bb8b242a22b5714fe5f17\": rpc error: code = NotFound desc = could not find container \"c1e1c3650a394bddd1a3dc4d06b8d68f91b4e3ccbd7bb8b242a22b5714fe5f17\": container with ID starting with c1e1c3650a394bddd1a3dc4d06b8d68f91b4e3ccbd7bb8b242a22b5714fe5f17 not found: ID does not exist"
Apr 21 10:54:14.459303 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:14.459283 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7995f6dc65-vxsbj"]
Apr 21 10:54:15.441518 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:15.441482 2577 generic.go:358] "Generic (PLEG): container finished" podID="495b8b26-8708-49ad-b82d-e8a20fe5a26a" containerID="0ddef7bde5a41790b79af84480391a92650edbe4350c82d8d7c9497f01665f57" exitCode=0
Apr 21 10:54:15.441955 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:15.441558 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs" event={"ID":"495b8b26-8708-49ad-b82d-e8a20fe5a26a","Type":"ContainerDied","Data":"0ddef7bde5a41790b79af84480391a92650edbe4350c82d8d7c9497f01665f57"}
Apr 21 10:54:16.232648 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:16.232608 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" path="/var/lib/kubelet/pods/4cca3709-1d82-405c-9e21-55c6546a18ac/volumes"
Apr 21 10:54:19.456737 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:19.456651 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs" event={"ID":"495b8b26-8708-49ad-b82d-e8a20fe5a26a","Type":"ContainerStarted","Data":"8432377a1090bbe8b0c735166ad590dd83256d674028b2dccc024924af2780d5"}
Apr 21 10:54:19.457188 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:19.457009 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs"
Apr 21 10:54:19.458324 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:19.458299 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs" podUID="495b8b26-8708-49ad-b82d-e8a20fe5a26a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 21 10:54:19.472154 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:19.472106 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs" podStartSLOduration=6.820090338 podStartE2EDuration="10.472091331s" podCreationTimestamp="2026-04-21 10:54:09 +0000 UTC" firstStartedPulling="2026-04-21 10:54:15.442727391 +0000 UTC m=+3023.781261386" lastFinishedPulling="2026-04-21 10:54:19.094728384 +0000 UTC m=+3027.433262379" observedRunningTime="2026-04-21 10:54:19.470946321 +0000 UTC m=+3027.809480337" watchObservedRunningTime="2026-04-21 10:54:19.472091331 +0000 UTC m=+3027.810625345"
Apr 21 10:54:20.459567 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:20.459522 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs" podUID="495b8b26-8708-49ad-b82d-e8a20fe5a26a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 21 10:54:30.460040 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:30.459997 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs" podUID="495b8b26-8708-49ad-b82d-e8a20fe5a26a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.56:8080: connect: connection refused"
Apr 21 10:54:40.460370 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:54:40.460338 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs"
Apr 21 10:55:01.640450 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:01.640408 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs"]
Apr 21 10:55:01.640909 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:01.640685 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs" podUID="495b8b26-8708-49ad-b82d-e8a20fe5a26a" containerName="kserve-container" containerID="cri-o://8432377a1090bbe8b0c735166ad590dd83256d674028b2dccc024924af2780d5" gracePeriod=30
Apr 21 10:55:01.704200 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:01.704161 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb"]
Apr 21 10:55:01.704467 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:01.704455 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerName="storage-initializer"
Apr 21 10:55:01.704511 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:01.704469 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerName="storage-initializer"
Apr 21 10:55:01.704511 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:01.704487 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerName="kserve-container"
Apr 21 10:55:01.704511 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:01.704492 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerName="kserve-container"
Apr 21 10:55:01.704610 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:01.704538 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4cca3709-1d82-405c-9e21-55c6546a18ac" containerName="kserve-container"
Apr 21 10:55:01.707357 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:01.707341 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb"
Apr 21 10:55:01.714627 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:01.714600 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb"]
Apr 21 10:55:01.805021 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:01.804980 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9ed4b32-8084-4ed1-ac54-8c46c04556b8-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb\" (UID: \"c9ed4b32-8084-4ed1-ac54-8c46c04556b8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb"
Apr 21 10:55:01.906272 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:01.906171 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9ed4b32-8084-4ed1-ac54-8c46c04556b8-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb\" (UID: \"c9ed4b32-8084-4ed1-ac54-8c46c04556b8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb"
Apr 21 10:55:01.906573 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:01.906551 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9ed4b32-8084-4ed1-ac54-8c46c04556b8-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb\" (UID: \"c9ed4b32-8084-4ed1-ac54-8c46c04556b8\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb"
Apr 21 10:55:02.018038 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:02.017969 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb"
Apr 21 10:55:02.140279 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:02.140253 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb"]
Apr 21 10:55:02.142946 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:55:02.142916 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9ed4b32_8084_4ed1_ac54_8c46c04556b8.slice/crio-6710a9a93b270d997db47857449054079e72e453957c374f828ccf6256c9d83b WatchSource:0}: Error finding container 6710a9a93b270d997db47857449054079e72e453957c374f828ccf6256c9d83b: Status 404 returned error can't find the container with id 6710a9a93b270d997db47857449054079e72e453957c374f828ccf6256c9d83b
Apr 21 10:55:02.580033 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:02.579995 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb" event={"ID":"c9ed4b32-8084-4ed1-ac54-8c46c04556b8","Type":"ContainerStarted","Data":"7e212796303b28cb7c97bd16bb9e034a282e60233ff5b55617ab0a288850756c"}
Apr 21 10:55:02.580033 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:02.580033 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb" event={"ID":"c9ed4b32-8084-4ed1-ac54-8c46c04556b8","Type":"ContainerStarted","Data":"6710a9a93b270d997db47857449054079e72e453957c374f828ccf6256c9d83b"}
Apr 21 10:55:06.592293 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:06.592260 2577 generic.go:358] "Generic (PLEG): container finished" podID="c9ed4b32-8084-4ed1-ac54-8c46c04556b8" containerID="7e212796303b28cb7c97bd16bb9e034a282e60233ff5b55617ab0a288850756c" exitCode=0
Apr 21 10:55:06.592655 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:06.592334 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb" event={"ID":"c9ed4b32-8084-4ed1-ac54-8c46c04556b8","Type":"ContainerDied","Data":"7e212796303b28cb7c97bd16bb9e034a282e60233ff5b55617ab0a288850756c"}
Apr 21 10:55:07.597451 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:07.597418 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb" event={"ID":"c9ed4b32-8084-4ed1-ac54-8c46c04556b8","Type":"ContainerStarted","Data":"1e930ef1f5f8120cbf7f0dbff4cfa72e5f5ff7b09dba8d51358cb0db2fe20df4"}
Apr 21 10:55:07.597858 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:07.597709 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb"
Apr 21 10:55:07.599227 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:07.599198 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb" podUID="c9ed4b32-8084-4ed1-ac54-8c46c04556b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused"
Apr 21 10:55:07.614458 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:07.614405 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb" podStartSLOduration=6.614391729 podStartE2EDuration="6.614391729s" podCreationTimestamp="2026-04-21 10:55:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:55:07.613275883 +0000 UTC m=+3075.951809903" watchObservedRunningTime="2026-04-21 10:55:07.614391729 +0000 UTC m=+3075.952925745"
Apr 21 10:55:08.600699 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:08.600656 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb" podUID="c9ed4b32-8084-4ed1-ac54-8c46c04556b8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.57:8080: connect: connection refused"
Apr 21 10:55:18.601657 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:18.601626 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb"
Apr 21 10:55:32.276056 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.276031 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs"
Apr 21 10:55:32.349758 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.349712 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/495b8b26-8708-49ad-b82d-e8a20fe5a26a-kserve-provision-location\") pod \"495b8b26-8708-49ad-b82d-e8a20fe5a26a\" (UID: \"495b8b26-8708-49ad-b82d-e8a20fe5a26a\") "
Apr 21 10:55:32.360829 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.360795 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/495b8b26-8708-49ad-b82d-e8a20fe5a26a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "495b8b26-8708-49ad-b82d-e8a20fe5a26a" (UID: "495b8b26-8708-49ad-b82d-e8a20fe5a26a"). InnerVolumeSpecName "kserve-provision-location".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:55:32.450901 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.450793 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/495b8b26-8708-49ad-b82d-e8a20fe5a26a-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:55:32.670873 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.670837 2577 generic.go:358] "Generic (PLEG): container finished" podID="495b8b26-8708-49ad-b82d-e8a20fe5a26a" containerID="8432377a1090bbe8b0c735166ad590dd83256d674028b2dccc024924af2780d5" exitCode=137 Apr 21 10:55:32.671044 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.670878 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs" event={"ID":"495b8b26-8708-49ad-b82d-e8a20fe5a26a","Type":"ContainerDied","Data":"8432377a1090bbe8b0c735166ad590dd83256d674028b2dccc024924af2780d5"} Apr 21 10:55:32.671044 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.670907 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs" Apr 21 10:55:32.671044 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.670914 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs" event={"ID":"495b8b26-8708-49ad-b82d-e8a20fe5a26a","Type":"ContainerDied","Data":"f6a32b0d22ed7aee870ab5182bfa845f7996a748cfdfe0348169d89d84c4ba51"} Apr 21 10:55:32.671044 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.670929 2577 scope.go:117] "RemoveContainer" containerID="8432377a1090bbe8b0c735166ad590dd83256d674028b2dccc024924af2780d5" Apr 21 10:55:32.679268 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.679249 2577 scope.go:117] "RemoveContainer" containerID="0ddef7bde5a41790b79af84480391a92650edbe4350c82d8d7c9497f01665f57" Apr 21 10:55:32.686348 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.686330 2577 scope.go:117] "RemoveContainer" containerID="8432377a1090bbe8b0c735166ad590dd83256d674028b2dccc024924af2780d5" Apr 21 10:55:32.686613 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:55:32.686594 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8432377a1090bbe8b0c735166ad590dd83256d674028b2dccc024924af2780d5\": container with ID starting with 8432377a1090bbe8b0c735166ad590dd83256d674028b2dccc024924af2780d5 not found: ID does not exist" containerID="8432377a1090bbe8b0c735166ad590dd83256d674028b2dccc024924af2780d5" Apr 21 10:55:32.686682 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.686626 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8432377a1090bbe8b0c735166ad590dd83256d674028b2dccc024924af2780d5"} err="failed to get container status \"8432377a1090bbe8b0c735166ad590dd83256d674028b2dccc024924af2780d5\": rpc error: code = NotFound desc = could not find container 
\"8432377a1090bbe8b0c735166ad590dd83256d674028b2dccc024924af2780d5\": container with ID starting with 8432377a1090bbe8b0c735166ad590dd83256d674028b2dccc024924af2780d5 not found: ID does not exist" Apr 21 10:55:32.686682 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.686651 2577 scope.go:117] "RemoveContainer" containerID="0ddef7bde5a41790b79af84480391a92650edbe4350c82d8d7c9497f01665f57" Apr 21 10:55:32.686910 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:55:32.686893 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ddef7bde5a41790b79af84480391a92650edbe4350c82d8d7c9497f01665f57\": container with ID starting with 0ddef7bde5a41790b79af84480391a92650edbe4350c82d8d7c9497f01665f57 not found: ID does not exist" containerID="0ddef7bde5a41790b79af84480391a92650edbe4350c82d8d7c9497f01665f57" Apr 21 10:55:32.686964 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.686916 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ddef7bde5a41790b79af84480391a92650edbe4350c82d8d7c9497f01665f57"} err="failed to get container status \"0ddef7bde5a41790b79af84480391a92650edbe4350c82d8d7c9497f01665f57\": rpc error: code = NotFound desc = could not find container \"0ddef7bde5a41790b79af84480391a92650edbe4350c82d8d7c9497f01665f57\": container with ID starting with 0ddef7bde5a41790b79af84480391a92650edbe4350c82d8d7c9497f01665f57 not found: ID does not exist" Apr 21 10:55:32.691286 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.691262 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs"] Apr 21 10:55:32.694088 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:32.694059 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-wpxxs"] Apr 21 10:55:33.230186 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:33.230154 2577 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb"] Apr 21 10:55:33.230479 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:33.230453 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb" podUID="c9ed4b32-8084-4ed1-ac54-8c46c04556b8" containerName="kserve-container" containerID="cri-o://1e930ef1f5f8120cbf7f0dbff4cfa72e5f5ff7b09dba8d51358cb0db2fe20df4" gracePeriod=30 Apr 21 10:55:33.291686 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:33.291649 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw"] Apr 21 10:55:33.292085 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:33.291957 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="495b8b26-8708-49ad-b82d-e8a20fe5a26a" containerName="kserve-container" Apr 21 10:55:33.292085 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:33.291968 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="495b8b26-8708-49ad-b82d-e8a20fe5a26a" containerName="kserve-container" Apr 21 10:55:33.292085 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:33.291982 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="495b8b26-8708-49ad-b82d-e8a20fe5a26a" containerName="storage-initializer" Apr 21 10:55:33.292085 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:33.291988 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="495b8b26-8708-49ad-b82d-e8a20fe5a26a" containerName="storage-initializer" Apr 21 10:55:33.292085 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:33.292041 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="495b8b26-8708-49ad-b82d-e8a20fe5a26a" containerName="kserve-container" Apr 21 10:55:33.296122 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:33.296100 2577 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" Apr 21 10:55:33.304738 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:33.304707 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw"] Apr 21 10:55:33.357898 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:33.357849 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9477c86-4129-438d-bb5c-f16c49d1e4a5-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-s29qw\" (UID: \"f9477c86-4129-438d-bb5c-f16c49d1e4a5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" Apr 21 10:55:33.459179 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:33.459144 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9477c86-4129-438d-bb5c-f16c49d1e4a5-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-s29qw\" (UID: \"f9477c86-4129-438d-bb5c-f16c49d1e4a5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" Apr 21 10:55:33.459504 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:33.459488 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9477c86-4129-438d-bb5c-f16c49d1e4a5-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-s29qw\" (UID: \"f9477c86-4129-438d-bb5c-f16c49d1e4a5\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" Apr 21 10:55:33.606441 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:33.606360 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" Apr 21 10:55:33.728480 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:33.728448 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw"] Apr 21 10:55:33.731617 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:55:33.731586 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9477c86_4129_438d_bb5c_f16c49d1e4a5.slice/crio-f67bad5c26e2056b676cafe988f03108a4810f69d9d49bba8b0ecf12d532a2ad WatchSource:0}: Error finding container f67bad5c26e2056b676cafe988f03108a4810f69d9d49bba8b0ecf12d532a2ad: Status 404 returned error can't find the container with id f67bad5c26e2056b676cafe988f03108a4810f69d9d49bba8b0ecf12d532a2ad Apr 21 10:55:34.234587 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:34.234552 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="495b8b26-8708-49ad-b82d-e8a20fe5a26a" path="/var/lib/kubelet/pods/495b8b26-8708-49ad-b82d-e8a20fe5a26a/volumes" Apr 21 10:55:34.679612 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:34.679528 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" event={"ID":"f9477c86-4129-438d-bb5c-f16c49d1e4a5","Type":"ContainerStarted","Data":"722533e58a29ba91b2449fe269c64ec73a71854a6862b234596e922ac7917363"} Apr 21 10:55:34.679612 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:34.679564 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" event={"ID":"f9477c86-4129-438d-bb5c-f16c49d1e4a5","Type":"ContainerStarted","Data":"f67bad5c26e2056b676cafe988f03108a4810f69d9d49bba8b0ecf12d532a2ad"} Apr 21 10:55:37.690049 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:37.690009 2577 generic.go:358] "Generic (PLEG): container finished" 
podID="f9477c86-4129-438d-bb5c-f16c49d1e4a5" containerID="722533e58a29ba91b2449fe269c64ec73a71854a6862b234596e922ac7917363" exitCode=0 Apr 21 10:55:37.690433 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:55:37.690072 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" event={"ID":"f9477c86-4129-438d-bb5c-f16c49d1e4a5","Type":"ContainerDied","Data":"722533e58a29ba91b2449fe269c64ec73a71854a6862b234596e922ac7917363"} Apr 21 10:56:03.801594 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:56:03.801542 2577 generic.go:358] "Generic (PLEG): container finished" podID="c9ed4b32-8084-4ed1-ac54-8c46c04556b8" containerID="1e930ef1f5f8120cbf7f0dbff4cfa72e5f5ff7b09dba8d51358cb0db2fe20df4" exitCode=137 Apr 21 10:56:03.802116 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:56:03.801652 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb" event={"ID":"c9ed4b32-8084-4ed1-ac54-8c46c04556b8","Type":"ContainerDied","Data":"1e930ef1f5f8120cbf7f0dbff4cfa72e5f5ff7b09dba8d51358cb0db2fe20df4"} Apr 21 10:56:04.068242 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:56:04.067933 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb" Apr 21 10:56:04.142790 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:56:04.142736 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9ed4b32-8084-4ed1-ac54-8c46c04556b8-kserve-provision-location\") pod \"c9ed4b32-8084-4ed1-ac54-8c46c04556b8\" (UID: \"c9ed4b32-8084-4ed1-ac54-8c46c04556b8\") " Apr 21 10:56:04.151242 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:56:04.151198 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9ed4b32-8084-4ed1-ac54-8c46c04556b8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c9ed4b32-8084-4ed1-ac54-8c46c04556b8" (UID: "c9ed4b32-8084-4ed1-ac54-8c46c04556b8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:56:04.243674 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:56:04.243515 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9ed4b32-8084-4ed1-ac54-8c46c04556b8-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:56:04.808694 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:56:04.808544 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb" event={"ID":"c9ed4b32-8084-4ed1-ac54-8c46c04556b8","Type":"ContainerDied","Data":"6710a9a93b270d997db47857449054079e72e453957c374f828ccf6256c9d83b"} Apr 21 10:56:04.808694 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:56:04.808585 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb" Apr 21 10:56:04.808694 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:56:04.808595 2577 scope.go:117] "RemoveContainer" containerID="1e930ef1f5f8120cbf7f0dbff4cfa72e5f5ff7b09dba8d51358cb0db2fe20df4" Apr 21 10:56:04.823880 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:56:04.823832 2577 scope.go:117] "RemoveContainer" containerID="7e212796303b28cb7c97bd16bb9e034a282e60233ff5b55617ab0a288850756c" Apr 21 10:56:04.827014 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:56:04.826446 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb"] Apr 21 10:56:04.830299 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:56:04.830061 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-4mjtb"] Apr 21 10:56:06.234420 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:56:06.234272 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ed4b32-8084-4ed1-ac54-8c46c04556b8" path="/var/lib/kubelet/pods/c9ed4b32-8084-4ed1-ac54-8c46c04556b8/volumes" Apr 21 10:57:32.092380 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:32.092343 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" event={"ID":"f9477c86-4129-438d-bb5c-f16c49d1e4a5","Type":"ContainerStarted","Data":"2fcc4c7e2698b2ecd235d88c495edc065e09c1464be7188572d8fe26f3d3f6ba"} Apr 21 10:57:32.092824 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:32.092547 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" Apr 21 10:57:32.094068 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:32.094040 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" 
podUID="f9477c86-4129-438d-bb5c-f16c49d1e4a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 21 10:57:32.109891 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:32.109827 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" podStartSLOduration=5.545409751 podStartE2EDuration="1m59.109811257s" podCreationTimestamp="2026-04-21 10:55:33 +0000 UTC" firstStartedPulling="2026-04-21 10:55:37.691162379 +0000 UTC m=+3106.029696374" lastFinishedPulling="2026-04-21 10:57:31.255563881 +0000 UTC m=+3219.594097880" observedRunningTime="2026-04-21 10:57:32.108876351 +0000 UTC m=+3220.447410367" watchObservedRunningTime="2026-04-21 10:57:32.109811257 +0000 UTC m=+3220.448345310" Apr 21 10:57:33.095321 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:33.095273 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" podUID="f9477c86-4129-438d-bb5c-f16c49d1e4a5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.58:8080: connect: connection refused" Apr 21 10:57:43.096039 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:43.095966 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" Apr 21 10:57:44.868079 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:44.868041 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw"] Apr 21 10:57:44.868472 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:44.868366 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" podUID="f9477c86-4129-438d-bb5c-f16c49d1e4a5" containerName="kserve-container" 
containerID="cri-o://2fcc4c7e2698b2ecd235d88c495edc065e09c1464be7188572d8fe26f3d3f6ba" gracePeriod=30 Apr 21 10:57:44.938183 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:44.938152 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx"] Apr 21 10:57:44.938476 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:44.938464 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9ed4b32-8084-4ed1-ac54-8c46c04556b8" containerName="kserve-container" Apr 21 10:57:44.938546 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:44.938478 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ed4b32-8084-4ed1-ac54-8c46c04556b8" containerName="kserve-container" Apr 21 10:57:44.938546 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:44.938494 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9ed4b32-8084-4ed1-ac54-8c46c04556b8" containerName="storage-initializer" Apr 21 10:57:44.938546 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:44.938500 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ed4b32-8084-4ed1-ac54-8c46c04556b8" containerName="storage-initializer" Apr 21 10:57:44.938546 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:44.938545 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9ed4b32-8084-4ed1-ac54-8c46c04556b8" containerName="kserve-container" Apr 21 10:57:44.967460 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:44.967423 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx"] Apr 21 10:57:44.967640 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:44.967547 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" Apr 21 10:57:45.001309 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:45.001270 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f3769f2-dff6-4f23-b963-363f382b9bde-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-k6tqx\" (UID: \"9f3769f2-dff6-4f23-b963-363f382b9bde\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" Apr 21 10:57:45.102768 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:45.102702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f3769f2-dff6-4f23-b963-363f382b9bde-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-k6tqx\" (UID: \"9f3769f2-dff6-4f23-b963-363f382b9bde\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" Apr 21 10:57:45.103120 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:45.103097 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f3769f2-dff6-4f23-b963-363f382b9bde-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-k6tqx\" (UID: \"9f3769f2-dff6-4f23-b963-363f382b9bde\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" Apr 21 10:57:45.278114 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:45.278078 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" Apr 21 10:57:45.402951 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:45.402912 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx"] Apr 21 10:57:45.407500 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:57:45.407458 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f3769f2_dff6_4f23_b963_363f382b9bde.slice/crio-f789da3e620282af34393386491c1ac85dfb463d31bfd45dec3271ea87384486 WatchSource:0}: Error finding container f789da3e620282af34393386491c1ac85dfb463d31bfd45dec3271ea87384486: Status 404 returned error can't find the container with id f789da3e620282af34393386491c1ac85dfb463d31bfd45dec3271ea87384486 Apr 21 10:57:46.134471 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:46.134435 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" event={"ID":"9f3769f2-dff6-4f23-b963-363f382b9bde","Type":"ContainerStarted","Data":"1826a125d77ee5fc78d11aff141371da4dce8f31ec39073b8391658c150f517d"} Apr 21 10:57:46.134471 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:46.134470 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" event={"ID":"9f3769f2-dff6-4f23-b963-363f382b9bde","Type":"ContainerStarted","Data":"f789da3e620282af34393386491c1ac85dfb463d31bfd45dec3271ea87384486"} Apr 21 10:57:47.112091 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.112061 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" Apr 21 10:57:47.117949 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.117922 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9477c86-4129-438d-bb5c-f16c49d1e4a5-kserve-provision-location\") pod \"f9477c86-4129-438d-bb5c-f16c49d1e4a5\" (UID: \"f9477c86-4129-438d-bb5c-f16c49d1e4a5\") " Apr 21 10:57:47.118324 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.118304 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9477c86-4129-438d-bb5c-f16c49d1e4a5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f9477c86-4129-438d-bb5c-f16c49d1e4a5" (UID: "f9477c86-4129-438d-bb5c-f16c49d1e4a5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:57:47.138767 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.138718 2577 generic.go:358] "Generic (PLEG): container finished" podID="f9477c86-4129-438d-bb5c-f16c49d1e4a5" containerID="2fcc4c7e2698b2ecd235d88c495edc065e09c1464be7188572d8fe26f3d3f6ba" exitCode=0 Apr 21 10:57:47.139166 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.138810 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" Apr 21 10:57:47.139166 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.138808 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" event={"ID":"f9477c86-4129-438d-bb5c-f16c49d1e4a5","Type":"ContainerDied","Data":"2fcc4c7e2698b2ecd235d88c495edc065e09c1464be7188572d8fe26f3d3f6ba"} Apr 21 10:57:47.139166 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.138852 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw" event={"ID":"f9477c86-4129-438d-bb5c-f16c49d1e4a5","Type":"ContainerDied","Data":"f67bad5c26e2056b676cafe988f03108a4810f69d9d49bba8b0ecf12d532a2ad"} Apr 21 10:57:47.139166 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.138873 2577 scope.go:117] "RemoveContainer" containerID="2fcc4c7e2698b2ecd235d88c495edc065e09c1464be7188572d8fe26f3d3f6ba" Apr 21 10:57:47.146942 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.146922 2577 scope.go:117] "RemoveContainer" containerID="722533e58a29ba91b2449fe269c64ec73a71854a6862b234596e922ac7917363" Apr 21 10:57:47.154365 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.154346 2577 scope.go:117] "RemoveContainer" containerID="2fcc4c7e2698b2ecd235d88c495edc065e09c1464be7188572d8fe26f3d3f6ba" Apr 21 10:57:47.154625 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:57:47.154602 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fcc4c7e2698b2ecd235d88c495edc065e09c1464be7188572d8fe26f3d3f6ba\": container with ID starting with 2fcc4c7e2698b2ecd235d88c495edc065e09c1464be7188572d8fe26f3d3f6ba not found: ID does not exist" containerID="2fcc4c7e2698b2ecd235d88c495edc065e09c1464be7188572d8fe26f3d3f6ba" Apr 21 10:57:47.154713 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.154634 2577 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"2fcc4c7e2698b2ecd235d88c495edc065e09c1464be7188572d8fe26f3d3f6ba"} err="failed to get container status \"2fcc4c7e2698b2ecd235d88c495edc065e09c1464be7188572d8fe26f3d3f6ba\": rpc error: code = NotFound desc = could not find container \"2fcc4c7e2698b2ecd235d88c495edc065e09c1464be7188572d8fe26f3d3f6ba\": container with ID starting with 2fcc4c7e2698b2ecd235d88c495edc065e09c1464be7188572d8fe26f3d3f6ba not found: ID does not exist" Apr 21 10:57:47.154713 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.154652 2577 scope.go:117] "RemoveContainer" containerID="722533e58a29ba91b2449fe269c64ec73a71854a6862b234596e922ac7917363" Apr 21 10:57:47.154905 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:57:47.154884 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"722533e58a29ba91b2449fe269c64ec73a71854a6862b234596e922ac7917363\": container with ID starting with 722533e58a29ba91b2449fe269c64ec73a71854a6862b234596e922ac7917363 not found: ID does not exist" containerID="722533e58a29ba91b2449fe269c64ec73a71854a6862b234596e922ac7917363" Apr 21 10:57:47.154946 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.154911 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"722533e58a29ba91b2449fe269c64ec73a71854a6862b234596e922ac7917363"} err="failed to get container status \"722533e58a29ba91b2449fe269c64ec73a71854a6862b234596e922ac7917363\": rpc error: code = NotFound desc = could not find container \"722533e58a29ba91b2449fe269c64ec73a71854a6862b234596e922ac7917363\": container with ID starting with 722533e58a29ba91b2449fe269c64ec73a71854a6862b234596e922ac7917363 not found: ID does not exist" Apr 21 10:57:47.188124 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.188087 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw"] Apr 21 10:57:47.201268 
ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.201239 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-s29qw"] Apr 21 10:57:47.219125 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:47.219097 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9477c86-4129-438d-bb5c-f16c49d1e4a5-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:57:48.232238 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:48.232197 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9477c86-4129-438d-bb5c-f16c49d1e4a5" path="/var/lib/kubelet/pods/f9477c86-4129-438d-bb5c-f16c49d1e4a5/volumes" Apr 21 10:57:51.152763 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:51.152727 2577 generic.go:358] "Generic (PLEG): container finished" podID="9f3769f2-dff6-4f23-b963-363f382b9bde" containerID="1826a125d77ee5fc78d11aff141371da4dce8f31ec39073b8391658c150f517d" exitCode=0 Apr 21 10:57:51.153280 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:57:51.152808 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" event={"ID":"9f3769f2-dff6-4f23-b963-363f382b9bde","Type":"ContainerDied","Data":"1826a125d77ee5fc78d11aff141371da4dce8f31ec39073b8391658c150f517d"} Apr 21 10:58:10.216157 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:58:10.216123 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" event={"ID":"9f3769f2-dff6-4f23-b963-363f382b9bde","Type":"ContainerStarted","Data":"7717483889daebd2d4c83f021cfc210da5493a0013ef3e3fbe93592ab9dec173"} Apr 21 10:58:10.216622 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:58:10.216403 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" Apr 21 
10:58:10.217688 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:58:10.217663 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" podUID="9f3769f2-dff6-4f23-b963-363f382b9bde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 21 10:58:10.231734 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:58:10.231676 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" podStartSLOduration=7.636440702 podStartE2EDuration="26.231658937s" podCreationTimestamp="2026-04-21 10:57:44 +0000 UTC" firstStartedPulling="2026-04-21 10:57:51.15412621 +0000 UTC m=+3239.492660205" lastFinishedPulling="2026-04-21 10:58:09.749344432 +0000 UTC m=+3258.087878440" observedRunningTime="2026-04-21 10:58:10.231086387 +0000 UTC m=+3258.569620404" watchObservedRunningTime="2026-04-21 10:58:10.231658937 +0000 UTC m=+3258.570192966" Apr 21 10:58:11.219763 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:58:11.219722 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" podUID="9f3769f2-dff6-4f23-b963-363f382b9bde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 21 10:58:21.219805 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:58:21.219739 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" podUID="9f3769f2-dff6-4f23-b963-363f382b9bde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 21 10:58:31.220297 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:58:31.220253 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" 
podUID="9f3769f2-dff6-4f23-b963-363f382b9bde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 21 10:58:41.220054 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:58:41.220006 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" podUID="9f3769f2-dff6-4f23-b963-363f382b9bde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 21 10:58:51.220148 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:58:51.220102 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" podUID="9f3769f2-dff6-4f23-b963-363f382b9bde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 21 10:58:52.352269 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:58:52.352240 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 10:58:52.356953 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:58:52.356926 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 10:59:01.219851 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:01.219807 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" podUID="9f3769f2-dff6-4f23-b963-363f382b9bde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.59:8080: connect: connection refused" Apr 21 10:59:11.221845 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:11.221760 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" Apr 21 10:59:15.041601 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:15.041565 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx"] Apr 21 10:59:15.042137 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:15.041975 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" podUID="9f3769f2-dff6-4f23-b963-363f382b9bde" containerName="kserve-container" containerID="cri-o://7717483889daebd2d4c83f021cfc210da5493a0013ef3e3fbe93592ab9dec173" gracePeriod=30 Apr 21 10:59:15.138759 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:15.138721 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd"] Apr 21 10:59:15.139047 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:15.139033 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9477c86-4129-438d-bb5c-f16c49d1e4a5" containerName="storage-initializer" Apr 21 10:59:15.139047 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:15.139049 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9477c86-4129-438d-bb5c-f16c49d1e4a5" containerName="storage-initializer" Apr 21 10:59:15.139139 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:15.139064 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9477c86-4129-438d-bb5c-f16c49d1e4a5" containerName="kserve-container" Apr 21 10:59:15.139139 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:15.139071 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9477c86-4129-438d-bb5c-f16c49d1e4a5" containerName="kserve-container" Apr 21 10:59:15.139139 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:15.139118 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9477c86-4129-438d-bb5c-f16c49d1e4a5" 
containerName="kserve-container" Apr 21 10:59:15.141950 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:15.141934 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" Apr 21 10:59:15.152370 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:15.152344 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd"] Apr 21 10:59:15.223619 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:15.223563 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd\" (UID: \"c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" Apr 21 10:59:15.324233 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:15.324118 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd\" (UID: \"c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" Apr 21 10:59:15.324529 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:15.324508 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd\" (UID: \"c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" Apr 21 10:59:15.452250 ip-10-0-133-157 
kubenswrapper[2577]: I0421 10:59:15.452212 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" Apr 21 10:59:15.575561 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:15.575494 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd"] Apr 21 10:59:15.578032 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:59:15.578005 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5789b15_fd96_49dd_8fbb_dfb54c3bdc6f.slice/crio-417a38c8e314f3c26aef0400a576ae95ccb1a0bf1a3289bf552260589e4bbea2 WatchSource:0}: Error finding container 417a38c8e314f3c26aef0400a576ae95ccb1a0bf1a3289bf552260589e4bbea2: Status 404 returned error can't find the container with id 417a38c8e314f3c26aef0400a576ae95ccb1a0bf1a3289bf552260589e4bbea2 Apr 21 10:59:15.580180 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:15.580164 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 10:59:16.405313 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:16.405275 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" event={"ID":"c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f","Type":"ContainerStarted","Data":"0a2a7e1414121ed76744caac2838ea8971ccb0f0f62ab807ddf84606e8c051f1"} Apr 21 10:59:16.405730 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:16.405322 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" event={"ID":"c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f","Type":"ContainerStarted","Data":"417a38c8e314f3c26aef0400a576ae95ccb1a0bf1a3289bf552260589e4bbea2"} Apr 21 10:59:18.882874 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:18.882849 2577 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" Apr 21 10:59:18.953515 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:18.953425 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f3769f2-dff6-4f23-b963-363f382b9bde-kserve-provision-location\") pod \"9f3769f2-dff6-4f23-b963-363f382b9bde\" (UID: \"9f3769f2-dff6-4f23-b963-363f382b9bde\") " Apr 21 10:59:18.953768 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:18.953727 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f3769f2-dff6-4f23-b963-363f382b9bde-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9f3769f2-dff6-4f23-b963-363f382b9bde" (UID: "9f3769f2-dff6-4f23-b963-363f382b9bde"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 10:59:19.054519 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:19.054481 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f3769f2-dff6-4f23-b963-363f382b9bde-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 10:59:19.415953 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:19.415919 2577 generic.go:358] "Generic (PLEG): container finished" podID="9f3769f2-dff6-4f23-b963-363f382b9bde" containerID="7717483889daebd2d4c83f021cfc210da5493a0013ef3e3fbe93592ab9dec173" exitCode=0 Apr 21 10:59:19.416155 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:19.415991 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" Apr 21 10:59:19.416155 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:19.416012 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" event={"ID":"9f3769f2-dff6-4f23-b963-363f382b9bde","Type":"ContainerDied","Data":"7717483889daebd2d4c83f021cfc210da5493a0013ef3e3fbe93592ab9dec173"} Apr 21 10:59:19.416155 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:19.416051 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx" event={"ID":"9f3769f2-dff6-4f23-b963-363f382b9bde","Type":"ContainerDied","Data":"f789da3e620282af34393386491c1ac85dfb463d31bfd45dec3271ea87384486"} Apr 21 10:59:19.416155 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:19.416071 2577 scope.go:117] "RemoveContainer" containerID="7717483889daebd2d4c83f021cfc210da5493a0013ef3e3fbe93592ab9dec173" Apr 21 10:59:19.417556 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:19.417534 2577 generic.go:358] "Generic (PLEG): container finished" podID="c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f" containerID="0a2a7e1414121ed76744caac2838ea8971ccb0f0f62ab807ddf84606e8c051f1" exitCode=0 Apr 21 10:59:19.417556 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:19.417560 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" event={"ID":"c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f","Type":"ContainerDied","Data":"0a2a7e1414121ed76744caac2838ea8971ccb0f0f62ab807ddf84606e8c051f1"} Apr 21 10:59:19.425535 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:19.425511 2577 scope.go:117] "RemoveContainer" containerID="1826a125d77ee5fc78d11aff141371da4dce8f31ec39073b8391658c150f517d" Apr 21 10:59:19.433217 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:19.433197 2577 scope.go:117] "RemoveContainer" 
containerID="7717483889daebd2d4c83f021cfc210da5493a0013ef3e3fbe93592ab9dec173" Apr 21 10:59:19.433500 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:59:19.433480 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7717483889daebd2d4c83f021cfc210da5493a0013ef3e3fbe93592ab9dec173\": container with ID starting with 7717483889daebd2d4c83f021cfc210da5493a0013ef3e3fbe93592ab9dec173 not found: ID does not exist" containerID="7717483889daebd2d4c83f021cfc210da5493a0013ef3e3fbe93592ab9dec173" Apr 21 10:59:19.433581 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:19.433514 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7717483889daebd2d4c83f021cfc210da5493a0013ef3e3fbe93592ab9dec173"} err="failed to get container status \"7717483889daebd2d4c83f021cfc210da5493a0013ef3e3fbe93592ab9dec173\": rpc error: code = NotFound desc = could not find container \"7717483889daebd2d4c83f021cfc210da5493a0013ef3e3fbe93592ab9dec173\": container with ID starting with 7717483889daebd2d4c83f021cfc210da5493a0013ef3e3fbe93592ab9dec173 not found: ID does not exist" Apr 21 10:59:19.433581 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:19.433543 2577 scope.go:117] "RemoveContainer" containerID="1826a125d77ee5fc78d11aff141371da4dce8f31ec39073b8391658c150f517d" Apr 21 10:59:19.433866 ip-10-0-133-157 kubenswrapper[2577]: E0421 10:59:19.433834 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1826a125d77ee5fc78d11aff141371da4dce8f31ec39073b8391658c150f517d\": container with ID starting with 1826a125d77ee5fc78d11aff141371da4dce8f31ec39073b8391658c150f517d not found: ID does not exist" containerID="1826a125d77ee5fc78d11aff141371da4dce8f31ec39073b8391658c150f517d" Apr 21 10:59:19.433923 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:19.433875 2577 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"1826a125d77ee5fc78d11aff141371da4dce8f31ec39073b8391658c150f517d"} err="failed to get container status \"1826a125d77ee5fc78d11aff141371da4dce8f31ec39073b8391658c150f517d\": rpc error: code = NotFound desc = could not find container \"1826a125d77ee5fc78d11aff141371da4dce8f31ec39073b8391658c150f517d\": container with ID starting with 1826a125d77ee5fc78d11aff141371da4dce8f31ec39073b8391658c150f517d not found: ID does not exist" Apr 21 10:59:19.448838 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:19.448805 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx"] Apr 21 10:59:19.452993 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:19.452967 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-k6tqx"] Apr 21 10:59:20.232383 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:20.232350 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f3769f2-dff6-4f23-b963-363f382b9bde" path="/var/lib/kubelet/pods/9f3769f2-dff6-4f23-b963-363f382b9bde/volumes" Apr 21 10:59:20.422537 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:20.422500 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" event={"ID":"c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f","Type":"ContainerStarted","Data":"bfbbb8167a53fe5abb712f67d052ce068a76397fff5a194c6a47fe55a808249b"} Apr 21 10:59:20.422770 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:20.422733 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" Apr 21 10:59:20.440584 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:20.440526 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" 
podStartSLOduration=5.440509359 podStartE2EDuration="5.440509359s" podCreationTimestamp="2026-04-21 10:59:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 10:59:20.439244187 +0000 UTC m=+3328.777778205" watchObservedRunningTime="2026-04-21 10:59:20.440509359 +0000 UTC m=+3328.779043377" Apr 21 10:59:51.494764 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:51.494712 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" Apr 21 10:59:55.243206 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:55.243165 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd"] Apr 21 10:59:55.243628 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:55.243510 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" podUID="c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f" containerName="kserve-container" containerID="cri-o://bfbbb8167a53fe5abb712f67d052ce068a76397fff5a194c6a47fe55a808249b" gracePeriod=30 Apr 21 10:59:55.304858 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:55.304826 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv"] Apr 21 10:59:55.305172 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:55.305159 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f3769f2-dff6-4f23-b963-363f382b9bde" containerName="kserve-container" Apr 21 10:59:55.305239 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:55.305174 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3769f2-dff6-4f23-b963-363f382b9bde" containerName="kserve-container" Apr 21 10:59:55.305239 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:55.305182 2577 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f3769f2-dff6-4f23-b963-363f382b9bde" containerName="storage-initializer" Apr 21 10:59:55.305239 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:55.305187 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3769f2-dff6-4f23-b963-363f382b9bde" containerName="storage-initializer" Apr 21 10:59:55.305239 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:55.305239 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f3769f2-dff6-4f23-b963-363f382b9bde" containerName="kserve-container" Apr 21 10:59:55.308212 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:55.308195 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" Apr 21 10:59:55.321550 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:55.321520 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv"] Apr 21 10:59:55.451136 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:55.451094 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91e418e9-8975-4775-94b1-a2405bd37c7e-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-6zbdv\" (UID: \"91e418e9-8975-4775-94b1-a2405bd37c7e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" Apr 21 10:59:55.552533 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:55.552435 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91e418e9-8975-4775-94b1-a2405bd37c7e-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-6zbdv\" (UID: \"91e418e9-8975-4775-94b1-a2405bd37c7e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" Apr 21 
10:59:55.552922 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:55.552898 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91e418e9-8975-4775-94b1-a2405bd37c7e-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-6zbdv\" (UID: \"91e418e9-8975-4775-94b1-a2405bd37c7e\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" Apr 21 10:59:55.617562 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:55.617517 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" Apr 21 10:59:55.783554 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:55.783519 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv"] Apr 21 10:59:55.786729 ip-10-0-133-157 kubenswrapper[2577]: W0421 10:59:55.786701 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91e418e9_8975_4775_94b1_a2405bd37c7e.slice/crio-534ab9354ebb26a55f927845ce33a9951d07d098881de3881694790107030c32 WatchSource:0}: Error finding container 534ab9354ebb26a55f927845ce33a9951d07d098881de3881694790107030c32: Status 404 returned error can't find the container with id 534ab9354ebb26a55f927845ce33a9951d07d098881de3881694790107030c32 Apr 21 10:59:56.526864 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:56.526827 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" event={"ID":"91e418e9-8975-4775-94b1-a2405bd37c7e","Type":"ContainerStarted","Data":"3621e15f1e93b5e6f375d30fccc4ab0d022af30be3e8a893d852ceec314ff043"} Apr 21 10:59:56.526864 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:56.526866 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" event={"ID":"91e418e9-8975-4775-94b1-a2405bd37c7e","Type":"ContainerStarted","Data":"534ab9354ebb26a55f927845ce33a9951d07d098881de3881694790107030c32"} Apr 21 10:59:59.537323 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:59.537291 2577 generic.go:358] "Generic (PLEG): container finished" podID="91e418e9-8975-4775-94b1-a2405bd37c7e" containerID="3621e15f1e93b5e6f375d30fccc4ab0d022af30be3e8a893d852ceec314ff043" exitCode=0 Apr 21 10:59:59.537735 ip-10-0-133-157 kubenswrapper[2577]: I0421 10:59:59.537337 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" event={"ID":"91e418e9-8975-4775-94b1-a2405bd37c7e","Type":"ContainerDied","Data":"3621e15f1e93b5e6f375d30fccc4ab0d022af30be3e8a893d852ceec314ff043"} Apr 21 11:00:00.541548 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:00.541515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" event={"ID":"91e418e9-8975-4775-94b1-a2405bd37c7e","Type":"ContainerStarted","Data":"7ffbdd6693d1094d56633be96c86e395dccb23f5e76904c512477599f3ff76e9"} Apr 21 11:00:00.541966 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:00.541761 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" Apr 21 11:00:00.559090 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:00.559038 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" podStartSLOduration=5.559021922 podStartE2EDuration="5.559021922s" podCreationTimestamp="2026-04-21 10:59:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 11:00:00.558179366 +0000 UTC m=+3368.896713384" 
watchObservedRunningTime="2026-04-21 11:00:00.559021922 +0000 UTC m=+3368.897555939" Apr 21 11:00:01.425993 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:01.425946 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" podUID="c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.60:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.134.0.60:8080: connect: connection refused" Apr 21 11:00:04.124995 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.124971 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" Apr 21 11:00:04.224646 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.224614 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f-kserve-provision-location\") pod \"c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f\" (UID: \"c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f\") " Apr 21 11:00:04.225004 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.224978 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f" (UID: "c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 11:00:04.326142 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.326102 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 11:00:04.554486 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.554396 2577 generic.go:358] "Generic (PLEG): container finished" podID="c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f" containerID="bfbbb8167a53fe5abb712f67d052ce068a76397fff5a194c6a47fe55a808249b" exitCode=0 Apr 21 11:00:04.554486 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.554445 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" event={"ID":"c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f","Type":"ContainerDied","Data":"bfbbb8167a53fe5abb712f67d052ce068a76397fff5a194c6a47fe55a808249b"} Apr 21 11:00:04.554486 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.554463 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" Apr 21 11:00:04.554486 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.554478 2577 scope.go:117] "RemoveContainer" containerID="bfbbb8167a53fe5abb712f67d052ce068a76397fff5a194c6a47fe55a808249b" Apr 21 11:00:04.554840 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.554468 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd" event={"ID":"c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f","Type":"ContainerDied","Data":"417a38c8e314f3c26aef0400a576ae95ccb1a0bf1a3289bf552260589e4bbea2"} Apr 21 11:00:04.562464 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.562444 2577 scope.go:117] "RemoveContainer" containerID="0a2a7e1414121ed76744caac2838ea8971ccb0f0f62ab807ddf84606e8c051f1" Apr 21 11:00:04.570329 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.570304 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd"] Apr 21 11:00:04.570411 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.570343 2577 scope.go:117] "RemoveContainer" containerID="bfbbb8167a53fe5abb712f67d052ce068a76397fff5a194c6a47fe55a808249b" Apr 21 11:00:04.570644 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:00:04.570626 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfbbb8167a53fe5abb712f67d052ce068a76397fff5a194c6a47fe55a808249b\": container with ID starting with bfbbb8167a53fe5abb712f67d052ce068a76397fff5a194c6a47fe55a808249b not found: ID does not exist" containerID="bfbbb8167a53fe5abb712f67d052ce068a76397fff5a194c6a47fe55a808249b" Apr 21 11:00:04.570785 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.570657 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bfbbb8167a53fe5abb712f67d052ce068a76397fff5a194c6a47fe55a808249b"} err="failed to get container status \"bfbbb8167a53fe5abb712f67d052ce068a76397fff5a194c6a47fe55a808249b\": rpc error: code = NotFound desc = could not find container \"bfbbb8167a53fe5abb712f67d052ce068a76397fff5a194c6a47fe55a808249b\": container with ID starting with bfbbb8167a53fe5abb712f67d052ce068a76397fff5a194c6a47fe55a808249b not found: ID does not exist" Apr 21 11:00:04.570785 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.570684 2577 scope.go:117] "RemoveContainer" containerID="0a2a7e1414121ed76744caac2838ea8971ccb0f0f62ab807ddf84606e8c051f1" Apr 21 11:00:04.570977 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:00:04.570959 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a2a7e1414121ed76744caac2838ea8971ccb0f0f62ab807ddf84606e8c051f1\": container with ID starting with 0a2a7e1414121ed76744caac2838ea8971ccb0f0f62ab807ddf84606e8c051f1 not found: ID does not exist" containerID="0a2a7e1414121ed76744caac2838ea8971ccb0f0f62ab807ddf84606e8c051f1" Apr 21 11:00:04.571026 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.570988 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2a7e1414121ed76744caac2838ea8971ccb0f0f62ab807ddf84606e8c051f1"} err="failed to get container status \"0a2a7e1414121ed76744caac2838ea8971ccb0f0f62ab807ddf84606e8c051f1\": rpc error: code = NotFound desc = could not find container \"0a2a7e1414121ed76744caac2838ea8971ccb0f0f62ab807ddf84606e8c051f1\": container with ID starting with 0a2a7e1414121ed76744caac2838ea8971ccb0f0f62ab807ddf84606e8c051f1 not found: ID does not exist" Apr 21 11:00:04.574073 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:04.574053 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-sv4bd"] Apr 21 11:00:06.232049 
ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:06.232013 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f" path="/var/lib/kubelet/pods/c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f/volumes" Apr 21 11:00:31.594936 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:31.594892 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" Apr 21 11:00:35.542558 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:35.542523 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv"] Apr 21 11:00:35.542969 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:35.542794 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" podUID="91e418e9-8975-4775-94b1-a2405bd37c7e" containerName="kserve-container" containerID="cri-o://7ffbdd6693d1094d56633be96c86e395dccb23f5e76904c512477599f3ff76e9" gracePeriod=30 Apr 21 11:00:35.597679 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:35.597644 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj"] Apr 21 11:00:35.597970 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:35.597954 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f" containerName="kserve-container" Apr 21 11:00:35.597970 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:35.597969 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f" containerName="kserve-container" Apr 21 11:00:35.598136 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:35.597988 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f" containerName="storage-initializer" Apr 21 
11:00:35.598136 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:35.597993 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f" containerName="storage-initializer" Apr 21 11:00:35.598136 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:35.598043 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="c5789b15-fd96-49dd-8fbb-dfb54c3bdc6f" containerName="kserve-container" Apr 21 11:00:35.601008 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:35.600985 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" Apr 21 11:00:35.608808 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:35.608780 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj"] Apr 21 11:00:35.666681 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:35.666642 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3bba156-5392-488d-a155-317611050f58-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-4zggj\" (UID: \"d3bba156-5392-488d-a155-317611050f58\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" Apr 21 11:00:35.768000 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:35.767963 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3bba156-5392-488d-a155-317611050f58-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-4zggj\" (UID: \"d3bba156-5392-488d-a155-317611050f58\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" Apr 21 11:00:35.768342 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:35.768321 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3bba156-5392-488d-a155-317611050f58-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-4zggj\" (UID: \"d3bba156-5392-488d-a155-317611050f58\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" Apr 21 11:00:35.913230 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:35.913142 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" Apr 21 11:00:36.038872 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:36.038844 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj"] Apr 21 11:00:36.041595 ip-10-0-133-157 kubenswrapper[2577]: W0421 11:00:36.041569 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3bba156_5392_488d_a155_317611050f58.slice/crio-fbc19b8f7e2160571689597f481cfbae6135768b6e0f17302e4e0d5b828b00e7 WatchSource:0}: Error finding container fbc19b8f7e2160571689597f481cfbae6135768b6e0f17302e4e0d5b828b00e7: Status 404 returned error can't find the container with id fbc19b8f7e2160571689597f481cfbae6135768b6e0f17302e4e0d5b828b00e7 Apr 21 11:00:36.651952 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:36.651915 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" event={"ID":"d3bba156-5392-488d-a155-317611050f58","Type":"ContainerStarted","Data":"63179ba43d65c85a1a52cc54a2b3cd0fe5d8803981ff04229a404983db9c81ad"} Apr 21 11:00:36.651952 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:36.651951 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" 
event={"ID":"d3bba156-5392-488d-a155-317611050f58","Type":"ContainerStarted","Data":"fbc19b8f7e2160571689597f481cfbae6135768b6e0f17302e4e0d5b828b00e7"} Apr 21 11:00:40.663519 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:40.663485 2577 generic.go:358] "Generic (PLEG): container finished" podID="d3bba156-5392-488d-a155-317611050f58" containerID="63179ba43d65c85a1a52cc54a2b3cd0fe5d8803981ff04229a404983db9c81ad" exitCode=0 Apr 21 11:00:40.663965 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:40.663561 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" event={"ID":"d3bba156-5392-488d-a155-317611050f58","Type":"ContainerDied","Data":"63179ba43d65c85a1a52cc54a2b3cd0fe5d8803981ff04229a404983db9c81ad"} Apr 21 11:00:41.546140 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:41.546096 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" podUID="91e418e9-8975-4775-94b1-a2405bd37c7e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.61:8080/v2/models/xgboost-v2-mlserver/ready\": dial tcp 10.134.0.61:8080: connect: connection refused" Apr 21 11:00:41.668356 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:41.668321 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" event={"ID":"d3bba156-5392-488d-a155-317611050f58","Type":"ContainerStarted","Data":"0d7f47ec54ceda811310ad01716b98ab8fc3ab89fa28516bf728b6fa787a448c"} Apr 21 11:00:41.668900 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:41.668674 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" Apr 21 11:00:41.670074 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:41.670045 2577 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" podUID="d3bba156-5392-488d-a155-317611050f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 21 11:00:41.683456 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:41.683314 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" podStartSLOduration=6.683297176 podStartE2EDuration="6.683297176s" podCreationTimestamp="2026-04-21 11:00:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 11:00:41.682691571 +0000 UTC m=+3410.021225589" watchObservedRunningTime="2026-04-21 11:00:41.683297176 +0000 UTC m=+3410.021831194" Apr 21 11:00:42.671645 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:42.671601 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" podUID="d3bba156-5392-488d-a155-317611050f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 21 11:00:43.676285 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:43.676252 2577 generic.go:358] "Generic (PLEG): container finished" podID="91e418e9-8975-4775-94b1-a2405bd37c7e" containerID="7ffbdd6693d1094d56633be96c86e395dccb23f5e76904c512477599f3ff76e9" exitCode=0 Apr 21 11:00:43.676684 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:43.676285 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" event={"ID":"91e418e9-8975-4775-94b1-a2405bd37c7e","Type":"ContainerDied","Data":"7ffbdd6693d1094d56633be96c86e395dccb23f5e76904c512477599f3ff76e9"} Apr 21 11:00:43.676684 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:43.676324 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" event={"ID":"91e418e9-8975-4775-94b1-a2405bd37c7e","Type":"ContainerDied","Data":"534ab9354ebb26a55f927845ce33a9951d07d098881de3881694790107030c32"} Apr 21 11:00:43.676684 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:43.676335 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="534ab9354ebb26a55f927845ce33a9951d07d098881de3881694790107030c32" Apr 21 11:00:43.686910 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:43.686889 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" Apr 21 11:00:43.731534 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:43.731499 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91e418e9-8975-4775-94b1-a2405bd37c7e-kserve-provision-location\") pod \"91e418e9-8975-4775-94b1-a2405bd37c7e\" (UID: \"91e418e9-8975-4775-94b1-a2405bd37c7e\") " Apr 21 11:00:43.731943 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:43.731920 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91e418e9-8975-4775-94b1-a2405bd37c7e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "91e418e9-8975-4775-94b1-a2405bd37c7e" (UID: "91e418e9-8975-4775-94b1-a2405bd37c7e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 11:00:43.833056 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:43.832960 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91e418e9-8975-4775-94b1-a2405bd37c7e-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 11:00:44.678802 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:44.678688 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv" Apr 21 11:00:44.695701 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:44.695668 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv"] Apr 21 11:00:44.701617 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:44.701589 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-6zbdv"] Apr 21 11:00:46.231984 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:46.231950 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91e418e9-8975-4775-94b1-a2405bd37c7e" path="/var/lib/kubelet/pods/91e418e9-8975-4775-94b1-a2405bd37c7e/volumes" Apr 21 11:00:52.672692 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:00:52.672643 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" podUID="d3bba156-5392-488d-a155-317611050f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 21 11:01:02.672659 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:02.672614 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" podUID="d3bba156-5392-488d-a155-317611050f58" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 21 11:01:12.672147 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:12.672102 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" podUID="d3bba156-5392-488d-a155-317611050f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 21 11:01:22.672093 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:22.672042 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" podUID="d3bba156-5392-488d-a155-317611050f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 21 11:01:32.672315 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:32.672265 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" podUID="d3bba156-5392-488d-a155-317611050f58" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.62:8080: connect: connection refused" Apr 21 11:01:42.672960 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:42.672929 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" Apr 21 11:01:45.708553 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:45.708518 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj"] Apr 21 11:01:45.708960 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:45.708804 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" podUID="d3bba156-5392-488d-a155-317611050f58" containerName="kserve-container" 
containerID="cri-o://0d7f47ec54ceda811310ad01716b98ab8fc3ab89fa28516bf728b6fa787a448c" gracePeriod=30 Apr 21 11:01:45.758645 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:45.758606 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db"] Apr 21 11:01:45.758963 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:45.758950 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91e418e9-8975-4775-94b1-a2405bd37c7e" containerName="storage-initializer" Apr 21 11:01:45.759017 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:45.758964 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e418e9-8975-4775-94b1-a2405bd37c7e" containerName="storage-initializer" Apr 21 11:01:45.759017 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:45.758973 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91e418e9-8975-4775-94b1-a2405bd37c7e" containerName="kserve-container" Apr 21 11:01:45.759017 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:45.758978 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e418e9-8975-4775-94b1-a2405bd37c7e" containerName="kserve-container" Apr 21 11:01:45.759110 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:45.759035 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="91e418e9-8975-4775-94b1-a2405bd37c7e" containerName="kserve-container" Apr 21 11:01:45.762070 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:45.762053 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" Apr 21 11:01:45.771029 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:45.770969 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db"] Apr 21 11:01:45.838658 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:45.838619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aaa907ac-bbd6-4945-8062-652608d8853b-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db\" (UID: \"aaa907ac-bbd6-4945-8062-652608d8853b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" Apr 21 11:01:45.939666 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:45.939622 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aaa907ac-bbd6-4945-8062-652608d8853b-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db\" (UID: \"aaa907ac-bbd6-4945-8062-652608d8853b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" Apr 21 11:01:45.940061 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:45.940040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aaa907ac-bbd6-4945-8062-652608d8853b-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db\" (UID: \"aaa907ac-bbd6-4945-8062-652608d8853b\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" Apr 21 11:01:46.072045 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:46.072009 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" Apr 21 11:01:46.191989 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:46.191958 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db"] Apr 21 11:01:46.199508 ip-10-0-133-157 kubenswrapper[2577]: W0421 11:01:46.199400 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaa907ac_bbd6_4945_8062_652608d8853b.slice/crio-5cae6bb389ed7c3c77659fa3ac546f228ab2ee889c08b378575427554369b965 WatchSource:0}: Error finding container 5cae6bb389ed7c3c77659fa3ac546f228ab2ee889c08b378575427554369b965: Status 404 returned error can't find the container with id 5cae6bb389ed7c3c77659fa3ac546f228ab2ee889c08b378575427554369b965 Apr 21 11:01:46.864461 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:46.864425 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" event={"ID":"aaa907ac-bbd6-4945-8062-652608d8853b","Type":"ContainerStarted","Data":"98d1e4348c3165092c1b5712fe2a1e4fc31d7249612f8001e9cde14cfe36dab7"} Apr 21 11:01:46.864461 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:46.864464 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" event={"ID":"aaa907ac-bbd6-4945-8062-652608d8853b","Type":"ContainerStarted","Data":"5cae6bb389ed7c3c77659fa3ac546f228ab2ee889c08b378575427554369b965"} Apr 21 11:01:49.556813 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.556788 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" Apr 21 11:01:49.669679 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.669587 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3bba156-5392-488d-a155-317611050f58-kserve-provision-location\") pod \"d3bba156-5392-488d-a155-317611050f58\" (UID: \"d3bba156-5392-488d-a155-317611050f58\") " Apr 21 11:01:49.670004 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.669981 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3bba156-5392-488d-a155-317611050f58-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d3bba156-5392-488d-a155-317611050f58" (UID: "d3bba156-5392-488d-a155-317611050f58"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 11:01:49.770631 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.770592 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3bba156-5392-488d-a155-317611050f58-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 11:01:49.873984 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.873950 2577 generic.go:358] "Generic (PLEG): container finished" podID="aaa907ac-bbd6-4945-8062-652608d8853b" containerID="98d1e4348c3165092c1b5712fe2a1e4fc31d7249612f8001e9cde14cfe36dab7" exitCode=0 Apr 21 11:01:49.874171 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.874023 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" event={"ID":"aaa907ac-bbd6-4945-8062-652608d8853b","Type":"ContainerDied","Data":"98d1e4348c3165092c1b5712fe2a1e4fc31d7249612f8001e9cde14cfe36dab7"} Apr 21 11:01:49.875563 ip-10-0-133-157 
kubenswrapper[2577]: I0421 11:01:49.875536 2577 generic.go:358] "Generic (PLEG): container finished" podID="d3bba156-5392-488d-a155-317611050f58" containerID="0d7f47ec54ceda811310ad01716b98ab8fc3ab89fa28516bf728b6fa787a448c" exitCode=0 Apr 21 11:01:49.875661 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.875586 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" event={"ID":"d3bba156-5392-488d-a155-317611050f58","Type":"ContainerDied","Data":"0d7f47ec54ceda811310ad01716b98ab8fc3ab89fa28516bf728b6fa787a448c"} Apr 21 11:01:49.875661 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.875592 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" Apr 21 11:01:49.875661 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.875615 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj" event={"ID":"d3bba156-5392-488d-a155-317611050f58","Type":"ContainerDied","Data":"fbc19b8f7e2160571689597f481cfbae6135768b6e0f17302e4e0d5b828b00e7"} Apr 21 11:01:49.875661 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.875630 2577 scope.go:117] "RemoveContainer" containerID="0d7f47ec54ceda811310ad01716b98ab8fc3ab89fa28516bf728b6fa787a448c" Apr 21 11:01:49.884052 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.883812 2577 scope.go:117] "RemoveContainer" containerID="63179ba43d65c85a1a52cc54a2b3cd0fe5d8803981ff04229a404983db9c81ad" Apr 21 11:01:49.891941 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.891914 2577 scope.go:117] "RemoveContainer" containerID="0d7f47ec54ceda811310ad01716b98ab8fc3ab89fa28516bf728b6fa787a448c" Apr 21 11:01:49.892226 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:01:49.892209 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0d7f47ec54ceda811310ad01716b98ab8fc3ab89fa28516bf728b6fa787a448c\": container with ID starting with 0d7f47ec54ceda811310ad01716b98ab8fc3ab89fa28516bf728b6fa787a448c not found: ID does not exist" containerID="0d7f47ec54ceda811310ad01716b98ab8fc3ab89fa28516bf728b6fa787a448c" Apr 21 11:01:49.892274 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.892236 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d7f47ec54ceda811310ad01716b98ab8fc3ab89fa28516bf728b6fa787a448c"} err="failed to get container status \"0d7f47ec54ceda811310ad01716b98ab8fc3ab89fa28516bf728b6fa787a448c\": rpc error: code = NotFound desc = could not find container \"0d7f47ec54ceda811310ad01716b98ab8fc3ab89fa28516bf728b6fa787a448c\": container with ID starting with 0d7f47ec54ceda811310ad01716b98ab8fc3ab89fa28516bf728b6fa787a448c not found: ID does not exist" Apr 21 11:01:49.892274 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.892254 2577 scope.go:117] "RemoveContainer" containerID="63179ba43d65c85a1a52cc54a2b3cd0fe5d8803981ff04229a404983db9c81ad" Apr 21 11:01:49.892547 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:01:49.892523 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63179ba43d65c85a1a52cc54a2b3cd0fe5d8803981ff04229a404983db9c81ad\": container with ID starting with 63179ba43d65c85a1a52cc54a2b3cd0fe5d8803981ff04229a404983db9c81ad not found: ID does not exist" containerID="63179ba43d65c85a1a52cc54a2b3cd0fe5d8803981ff04229a404983db9c81ad" Apr 21 11:01:49.892639 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.892561 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63179ba43d65c85a1a52cc54a2b3cd0fe5d8803981ff04229a404983db9c81ad"} err="failed to get container status \"63179ba43d65c85a1a52cc54a2b3cd0fe5d8803981ff04229a404983db9c81ad\": rpc error: code = NotFound desc = could not find container 
\"63179ba43d65c85a1a52cc54a2b3cd0fe5d8803981ff04229a404983db9c81ad\": container with ID starting with 63179ba43d65c85a1a52cc54a2b3cd0fe5d8803981ff04229a404983db9c81ad not found: ID does not exist" Apr 21 11:01:49.902162 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.902132 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj"] Apr 21 11:01:49.905563 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:49.905535 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-4zggj"] Apr 21 11:01:50.231651 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:50.231614 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3bba156-5392-488d-a155-317611050f58" path="/var/lib/kubelet/pods/d3bba156-5392-488d-a155-317611050f58/volumes" Apr 21 11:01:50.881288 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:50.881251 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" event={"ID":"aaa907ac-bbd6-4945-8062-652608d8853b","Type":"ContainerStarted","Data":"cf8bd887d703906530a5e70437dd121ccd7b9c4e6f73a0decdc14c2b2c02307c"} Apr 21 11:01:50.881658 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:50.881483 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" Apr 21 11:01:50.900063 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:01:50.899989 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" podStartSLOduration=5.899968036 podStartE2EDuration="5.899968036s" podCreationTimestamp="2026-04-21 11:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 11:01:50.897287115 +0000 
UTC m=+3479.235821135" watchObservedRunningTime="2026-04-21 11:01:50.899968036 +0000 UTC m=+3479.238502055" Apr 21 11:02:21.895528 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:21.895482 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" podUID="aaa907ac-bbd6-4945-8062-652608d8853b" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 21 11:02:31.894334 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:31.894275 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" podUID="aaa907ac-bbd6-4945-8062-652608d8853b" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 21 11:02:41.887412 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:41.887382 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" Apr 21 11:02:45.882780 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:45.882724 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db"] Apr 21 11:02:45.883262 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:45.883084 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" podUID="aaa907ac-bbd6-4945-8062-652608d8853b" containerName="kserve-container" containerID="cri-o://cf8bd887d703906530a5e70437dd121ccd7b9c4e6f73a0decdc14c2b2c02307c" gracePeriod=30 Apr 21 11:02:45.941126 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:45.941090 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb"] Apr 21 11:02:45.941408 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:45.941396 2577 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3bba156-5392-488d-a155-317611050f58" containerName="kserve-container" Apr 21 11:02:45.941459 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:45.941410 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3bba156-5392-488d-a155-317611050f58" containerName="kserve-container" Apr 21 11:02:45.941459 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:45.941427 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3bba156-5392-488d-a155-317611050f58" containerName="storage-initializer" Apr 21 11:02:45.941459 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:45.941432 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3bba156-5392-488d-a155-317611050f58" containerName="storage-initializer" Apr 21 11:02:45.941555 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:45.941477 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3bba156-5392-488d-a155-317611050f58" containerName="kserve-container" Apr 21 11:02:45.944378 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:45.944361 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" Apr 21 11:02:45.954676 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:45.954650 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb"] Apr 21 11:02:45.998060 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:45.998018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33f3e04e-7d3a-4d53-87d2-ec804501ab45-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-d7vtb\" (UID: \"33f3e04e-7d3a-4d53-87d2-ec804501ab45\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" Apr 21 11:02:46.099501 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:46.099459 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33f3e04e-7d3a-4d53-87d2-ec804501ab45-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-d7vtb\" (UID: \"33f3e04e-7d3a-4d53-87d2-ec804501ab45\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" Apr 21 11:02:46.099989 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:46.099963 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33f3e04e-7d3a-4d53-87d2-ec804501ab45-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-d7vtb\" (UID: \"33f3e04e-7d3a-4d53-87d2-ec804501ab45\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" Apr 21 11:02:46.255259 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:46.255220 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" Apr 21 11:02:46.375518 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:46.375480 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb"] Apr 21 11:02:46.378999 ip-10-0-133-157 kubenswrapper[2577]: W0421 11:02:46.378961 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33f3e04e_7d3a_4d53_87d2_ec804501ab45.slice/crio-33448d8d0a53f404315a23520145690cef8988c2f85a55f3629b0973efa3b2b7 WatchSource:0}: Error finding container 33448d8d0a53f404315a23520145690cef8988c2f85a55f3629b0973efa3b2b7: Status 404 returned error can't find the container with id 33448d8d0a53f404315a23520145690cef8988c2f85a55f3629b0973efa3b2b7 Apr 21 11:02:47.047687 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:47.047651 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" event={"ID":"33f3e04e-7d3a-4d53-87d2-ec804501ab45","Type":"ContainerStarted","Data":"0e33a3c907ac19f5de684e6367891529948db35ebc1f372f90dca86438459ffc"} Apr 21 11:02:47.047687 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:47.047689 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" event={"ID":"33f3e04e-7d3a-4d53-87d2-ec804501ab45","Type":"ContainerStarted","Data":"33448d8d0a53f404315a23520145690cef8988c2f85a55f3629b0973efa3b2b7"} Apr 21 11:02:51.062412 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:51.062374 2577 generic.go:358] "Generic (PLEG): container finished" podID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" containerID="0e33a3c907ac19f5de684e6367891529948db35ebc1f372f90dca86438459ffc" exitCode=0 Apr 21 11:02:51.062826 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:51.062453 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" event={"ID":"33f3e04e-7d3a-4d53-87d2-ec804501ab45","Type":"ContainerDied","Data":"0e33a3c907ac19f5de684e6367891529948db35ebc1f372f90dca86438459ffc"} Apr 21 11:02:51.885700 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:51.885654 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" podUID="aaa907ac-bbd6-4945-8062-652608d8853b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.134.0.63:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.134.0.63:8080: connect: connection refused" Apr 21 11:02:52.066986 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:52.066953 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" event={"ID":"33f3e04e-7d3a-4d53-87d2-ec804501ab45","Type":"ContainerStarted","Data":"db37d326e197308a0ecb8c700fdec39429ecc0f962f55ef228ef9b6ecb22c67c"} Apr 21 11:02:52.067380 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:52.067253 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" Apr 21 11:02:52.068723 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:52.068690 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" podUID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 21 11:02:52.082278 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:52.082234 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" podStartSLOduration=7.082217698 podStartE2EDuration="7.082217698s" podCreationTimestamp="2026-04-21 11:02:45 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 11:02:52.081269453 +0000 UTC m=+3540.419803470" watchObservedRunningTime="2026-04-21 11:02:52.082217698 +0000 UTC m=+3540.420751716" Apr 21 11:02:53.069797 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:53.069741 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" podUID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 21 11:02:53.916871 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:53.916845 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" Apr 21 11:02:53.952397 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:53.952304 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aaa907ac-bbd6-4945-8062-652608d8853b-kserve-provision-location\") pod \"aaa907ac-bbd6-4945-8062-652608d8853b\" (UID: \"aaa907ac-bbd6-4945-8062-652608d8853b\") " Apr 21 11:02:53.952661 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:53.952634 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aaa907ac-bbd6-4945-8062-652608d8853b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "aaa907ac-bbd6-4945-8062-652608d8853b" (UID: "aaa907ac-bbd6-4945-8062-652608d8853b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 11:02:54.053595 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:54.053540 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aaa907ac-bbd6-4945-8062-652608d8853b-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 11:02:54.073071 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:54.073033 2577 generic.go:358] "Generic (PLEG): container finished" podID="aaa907ac-bbd6-4945-8062-652608d8853b" containerID="cf8bd887d703906530a5e70437dd121ccd7b9c4e6f73a0decdc14c2b2c02307c" exitCode=0 Apr 21 11:02:54.073473 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:54.073099 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" Apr 21 11:02:54.073473 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:54.073118 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" event={"ID":"aaa907ac-bbd6-4945-8062-652608d8853b","Type":"ContainerDied","Data":"cf8bd887d703906530a5e70437dd121ccd7b9c4e6f73a0decdc14c2b2c02307c"} Apr 21 11:02:54.073473 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:54.073156 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db" event={"ID":"aaa907ac-bbd6-4945-8062-652608d8853b","Type":"ContainerDied","Data":"5cae6bb389ed7c3c77659fa3ac546f228ab2ee889c08b378575427554369b965"} Apr 21 11:02:54.073473 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:54.073171 2577 scope.go:117] "RemoveContainer" containerID="cf8bd887d703906530a5e70437dd121ccd7b9c4e6f73a0decdc14c2b2c02307c" Apr 21 11:02:54.081762 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:54.081723 2577 scope.go:117] "RemoveContainer" 
containerID="98d1e4348c3165092c1b5712fe2a1e4fc31d7249612f8001e9cde14cfe36dab7" Apr 21 11:02:54.088731 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:54.088711 2577 scope.go:117] "RemoveContainer" containerID="cf8bd887d703906530a5e70437dd121ccd7b9c4e6f73a0decdc14c2b2c02307c" Apr 21 11:02:54.089002 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:02:54.088986 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf8bd887d703906530a5e70437dd121ccd7b9c4e6f73a0decdc14c2b2c02307c\": container with ID starting with cf8bd887d703906530a5e70437dd121ccd7b9c4e6f73a0decdc14c2b2c02307c not found: ID does not exist" containerID="cf8bd887d703906530a5e70437dd121ccd7b9c4e6f73a0decdc14c2b2c02307c" Apr 21 11:02:54.089042 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:54.089012 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf8bd887d703906530a5e70437dd121ccd7b9c4e6f73a0decdc14c2b2c02307c"} err="failed to get container status \"cf8bd887d703906530a5e70437dd121ccd7b9c4e6f73a0decdc14c2b2c02307c\": rpc error: code = NotFound desc = could not find container \"cf8bd887d703906530a5e70437dd121ccd7b9c4e6f73a0decdc14c2b2c02307c\": container with ID starting with cf8bd887d703906530a5e70437dd121ccd7b9c4e6f73a0decdc14c2b2c02307c not found: ID does not exist" Apr 21 11:02:54.089042 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:54.089030 2577 scope.go:117] "RemoveContainer" containerID="98d1e4348c3165092c1b5712fe2a1e4fc31d7249612f8001e9cde14cfe36dab7" Apr 21 11:02:54.089271 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:02:54.089249 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98d1e4348c3165092c1b5712fe2a1e4fc31d7249612f8001e9cde14cfe36dab7\": container with ID starting with 98d1e4348c3165092c1b5712fe2a1e4fc31d7249612f8001e9cde14cfe36dab7 not found: ID does not exist" 
containerID="98d1e4348c3165092c1b5712fe2a1e4fc31d7249612f8001e9cde14cfe36dab7" Apr 21 11:02:54.089311 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:54.089278 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98d1e4348c3165092c1b5712fe2a1e4fc31d7249612f8001e9cde14cfe36dab7"} err="failed to get container status \"98d1e4348c3165092c1b5712fe2a1e4fc31d7249612f8001e9cde14cfe36dab7\": rpc error: code = NotFound desc = could not find container \"98d1e4348c3165092c1b5712fe2a1e4fc31d7249612f8001e9cde14cfe36dab7\": container with ID starting with 98d1e4348c3165092c1b5712fe2a1e4fc31d7249612f8001e9cde14cfe36dab7 not found: ID does not exist" Apr 21 11:02:54.093548 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:54.093526 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db"] Apr 21 11:02:54.095817 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:54.095797 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tr2db"] Apr 21 11:02:54.237805 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:02:54.237767 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaa907ac-bbd6-4945-8062-652608d8853b" path="/var/lib/kubelet/pods/aaa907ac-bbd6-4945-8062-652608d8853b/volumes" Apr 21 11:03:03.070215 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:03.070169 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" podUID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 21 11:03:13.070489 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:13.070438 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" 
podUID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 21 11:03:23.070658 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:23.070613 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" podUID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 21 11:03:33.069991 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:33.069946 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" podUID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 21 11:03:43.070247 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:43.070147 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" podUID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.64:8080: connect: connection refused" Apr 21 11:03:52.374430 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:52.374400 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 11:03:52.379188 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:52.379165 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 11:03:53.071436 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:53.071402 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" Apr 21 11:03:56.060482 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.060444 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb"] Apr 21 11:03:56.060919 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.060731 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" podUID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" containerName="kserve-container" containerID="cri-o://db37d326e197308a0ecb8c700fdec39429ecc0f962f55ef228ef9b6ecb22c67c" gracePeriod=30 Apr 21 11:03:56.098082 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.098042 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc"] Apr 21 11:03:56.098345 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.098334 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aaa907ac-bbd6-4945-8062-652608d8853b" containerName="kserve-container" Apr 21 11:03:56.098405 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.098347 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa907ac-bbd6-4945-8062-652608d8853b" containerName="kserve-container" Apr 21 11:03:56.098405 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.098366 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aaa907ac-bbd6-4945-8062-652608d8853b" containerName="storage-initializer" Apr 21 11:03:56.098405 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.098372 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaa907ac-bbd6-4945-8062-652608d8853b" containerName="storage-initializer" Apr 21 11:03:56.098521 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.098427 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="aaa907ac-bbd6-4945-8062-652608d8853b" 
containerName="kserve-container" Apr 21 11:03:56.101419 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.101399 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" Apr 21 11:03:56.104010 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.103984 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 21 11:03:56.109576 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.109537 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc"] Apr 21 11:03:56.230591 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.230562 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59413f7c-baf1-4468-bed9-de6838b70033-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-d459cc9c7-54fdc\" (UID: \"59413f7c-baf1-4468-bed9-de6838b70033\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" Apr 21 11:03:56.331171 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.331070 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59413f7c-baf1-4468-bed9-de6838b70033-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-d459cc9c7-54fdc\" (UID: \"59413f7c-baf1-4468-bed9-de6838b70033\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" Apr 21 11:03:56.331458 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.331437 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59413f7c-baf1-4468-bed9-de6838b70033-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-d459cc9c7-54fdc\" (UID: 
\"59413f7c-baf1-4468-bed9-de6838b70033\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" Apr 21 11:03:56.412689 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.412647 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" Apr 21 11:03:56.534290 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:56.534253 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc"] Apr 21 11:03:56.537373 ip-10-0-133-157 kubenswrapper[2577]: W0421 11:03:56.537340 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59413f7c_baf1_4468_bed9_de6838b70033.slice/crio-871a0e4e0b962e5ba56be9deb47b3ce6cfec427e00fa4afa48ef5c559af62683 WatchSource:0}: Error finding container 871a0e4e0b962e5ba56be9deb47b3ce6cfec427e00fa4afa48ef5c559af62683: Status 404 returned error can't find the container with id 871a0e4e0b962e5ba56be9deb47b3ce6cfec427e00fa4afa48ef5c559af62683 Apr 21 11:03:57.261040 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:57.261003 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" event={"ID":"59413f7c-baf1-4468-bed9-de6838b70033","Type":"ContainerStarted","Data":"dc6acf4ea0d0c1d870d59a1b3fa398a21f51ca98db3fb03c35cf9a9c3e9e4e6f"} Apr 21 11:03:57.261040 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:57.261043 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" event={"ID":"59413f7c-baf1-4468-bed9-de6838b70033","Type":"ContainerStarted","Data":"871a0e4e0b962e5ba56be9deb47b3ce6cfec427e00fa4afa48ef5c559af62683"} Apr 21 11:03:58.265607 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:58.265572 2577 generic.go:358] "Generic (PLEG): container finished" podID="59413f7c-baf1-4468-bed9-de6838b70033" 
containerID="dc6acf4ea0d0c1d870d59a1b3fa398a21f51ca98db3fb03c35cf9a9c3e9e4e6f" exitCode=0 Apr 21 11:03:58.266031 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:58.265672 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" event={"ID":"59413f7c-baf1-4468-bed9-de6838b70033","Type":"ContainerDied","Data":"dc6acf4ea0d0c1d870d59a1b3fa398a21f51ca98db3fb03c35cf9a9c3e9e4e6f"} Apr 21 11:03:59.270442 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:59.270407 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" event={"ID":"59413f7c-baf1-4468-bed9-de6838b70033","Type":"ContainerStarted","Data":"1875172b5581f8e8a2e4d6c102bb3dca480440c41900c28f08e7af76ab1985cb"} Apr 21 11:03:59.270911 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:59.270674 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" Apr 21 11:03:59.272020 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:59.271986 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" podUID="59413f7c-baf1-4468-bed9-de6838b70033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 21 11:03:59.285234 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:59.285182 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" podStartSLOduration=3.285165924 podStartE2EDuration="3.285165924s" podCreationTimestamp="2026-04-21 11:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 11:03:59.284542904 +0000 UTC m=+3607.623076921" watchObservedRunningTime="2026-04-21 11:03:59.285165924 +0000 UTC 
m=+3607.623699939" Apr 21 11:03:59.997530 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:03:59.997505 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" Apr 21 11:04:00.160240 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.160151 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33f3e04e-7d3a-4d53-87d2-ec804501ab45-kserve-provision-location\") pod \"33f3e04e-7d3a-4d53-87d2-ec804501ab45\" (UID: \"33f3e04e-7d3a-4d53-87d2-ec804501ab45\") " Apr 21 11:04:00.160492 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.160471 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33f3e04e-7d3a-4d53-87d2-ec804501ab45-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "33f3e04e-7d3a-4d53-87d2-ec804501ab45" (UID: "33f3e04e-7d3a-4d53-87d2-ec804501ab45"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 11:04:00.261472 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.261436 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/33f3e04e-7d3a-4d53-87d2-ec804501ab45-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 11:04:00.275532 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.275496 2577 generic.go:358] "Generic (PLEG): container finished" podID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" containerID="db37d326e197308a0ecb8c700fdec39429ecc0f962f55ef228ef9b6ecb22c67c" exitCode=0 Apr 21 11:04:00.275972 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.275572 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" Apr 21 11:04:00.275972 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.275581 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" event={"ID":"33f3e04e-7d3a-4d53-87d2-ec804501ab45","Type":"ContainerDied","Data":"db37d326e197308a0ecb8c700fdec39429ecc0f962f55ef228ef9b6ecb22c67c"} Apr 21 11:04:00.275972 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.275617 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb" event={"ID":"33f3e04e-7d3a-4d53-87d2-ec804501ab45","Type":"ContainerDied","Data":"33448d8d0a53f404315a23520145690cef8988c2f85a55f3629b0973efa3b2b7"} Apr 21 11:04:00.275972 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.275636 2577 scope.go:117] "RemoveContainer" containerID="db37d326e197308a0ecb8c700fdec39429ecc0f962f55ef228ef9b6ecb22c67c" Apr 21 11:04:00.276331 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.276305 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" podUID="59413f7c-baf1-4468-bed9-de6838b70033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 21 11:04:00.283443 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.283415 2577 scope.go:117] "RemoveContainer" containerID="0e33a3c907ac19f5de684e6367891529948db35ebc1f372f90dca86438459ffc" Apr 21 11:04:00.290877 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.290803 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb"] Apr 21 11:04:00.290949 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.290916 2577 scope.go:117] "RemoveContainer" containerID="db37d326e197308a0ecb8c700fdec39429ecc0f962f55ef228ef9b6ecb22c67c" Apr 21 11:04:00.291235 
ip-10-0-133-157 kubenswrapper[2577]: E0421 11:04:00.291218 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db37d326e197308a0ecb8c700fdec39429ecc0f962f55ef228ef9b6ecb22c67c\": container with ID starting with db37d326e197308a0ecb8c700fdec39429ecc0f962f55ef228ef9b6ecb22c67c not found: ID does not exist" containerID="db37d326e197308a0ecb8c700fdec39429ecc0f962f55ef228ef9b6ecb22c67c" Apr 21 11:04:00.291289 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.291243 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db37d326e197308a0ecb8c700fdec39429ecc0f962f55ef228ef9b6ecb22c67c"} err="failed to get container status \"db37d326e197308a0ecb8c700fdec39429ecc0f962f55ef228ef9b6ecb22c67c\": rpc error: code = NotFound desc = could not find container \"db37d326e197308a0ecb8c700fdec39429ecc0f962f55ef228ef9b6ecb22c67c\": container with ID starting with db37d326e197308a0ecb8c700fdec39429ecc0f962f55ef228ef9b6ecb22c67c not found: ID does not exist" Apr 21 11:04:00.291289 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.291260 2577 scope.go:117] "RemoveContainer" containerID="0e33a3c907ac19f5de684e6367891529948db35ebc1f372f90dca86438459ffc" Apr 21 11:04:00.291487 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:04:00.291468 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e33a3c907ac19f5de684e6367891529948db35ebc1f372f90dca86438459ffc\": container with ID starting with 0e33a3c907ac19f5de684e6367891529948db35ebc1f372f90dca86438459ffc not found: ID does not exist" containerID="0e33a3c907ac19f5de684e6367891529948db35ebc1f372f90dca86438459ffc" Apr 21 11:04:00.291526 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.291494 2577 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0e33a3c907ac19f5de684e6367891529948db35ebc1f372f90dca86438459ffc"} err="failed to get container status \"0e33a3c907ac19f5de684e6367891529948db35ebc1f372f90dca86438459ffc\": rpc error: code = NotFound desc = could not find container \"0e33a3c907ac19f5de684e6367891529948db35ebc1f372f90dca86438459ffc\": container with ID starting with 0e33a3c907ac19f5de684e6367891529948db35ebc1f372f90dca86438459ffc not found: ID does not exist" Apr 21 11:04:00.294408 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:00.294384 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-d7vtb"] Apr 21 11:04:02.232544 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:02.232510 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" path="/var/lib/kubelet/pods/33f3e04e-7d3a-4d53-87d2-ec804501ab45/volumes" Apr 21 11:04:10.277107 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:10.277059 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" podUID="59413f7c-baf1-4468-bed9-de6838b70033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 21 11:04:20.276960 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:20.276909 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" podUID="59413f7c-baf1-4468-bed9-de6838b70033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused" Apr 21 11:04:30.277178 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:30.277131 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" podUID="59413f7c-baf1-4468-bed9-de6838b70033" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.134.0.65:8080: connect: connection refused"
Apr 21 11:04:40.276803 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:40.276732 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" podUID="59413f7c-baf1-4468-bed9-de6838b70033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 21 11:04:50.276499 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:04:50.276447 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" podUID="59413f7c-baf1-4468-bed9-de6838b70033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 21 11:05:00.277037 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:00.276982 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" podUID="59413f7c-baf1-4468-bed9-de6838b70033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 21 11:05:02.231554 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:02.231510 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" podUID="59413f7c-baf1-4468-bed9-de6838b70033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.65:8080: connect: connection refused"
Apr 21 11:05:12.232997 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:12.232918 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc"
Apr 21 11:05:16.213982 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.213946 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc"]
Apr 21 11:05:16.214385 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.214230 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" podUID="59413f7c-baf1-4468-bed9-de6838b70033" containerName="kserve-container" containerID="cri-o://1875172b5581f8e8a2e4d6c102bb3dca480440c41900c28f08e7af76ab1985cb" gracePeriod=30
Apr 21 11:05:16.320433 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.320375 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"]
Apr 21 11:05:16.320733 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.320720 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" containerName="storage-initializer"
Apr 21 11:05:16.320817 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.320734 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" containerName="storage-initializer"
Apr 21 11:05:16.320817 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.320765 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" containerName="kserve-container"
Apr 21 11:05:16.320817 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.320771 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" containerName="kserve-container"
Apr 21 11:05:16.320924 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.320831 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="33f3e04e-7d3a-4d53-87d2-ec804501ab45" containerName="kserve-container"
Apr 21 11:05:16.323818 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.323798 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"
Apr 21 11:05:16.326160 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.326133 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 21 11:05:16.333489 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.333464 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"]
Apr 21 11:05:16.463472 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.463429 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a14573e5-0f59-4238-accf-2448c4547453-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn\" (UID: \"a14573e5-0f59-4238-accf-2448c4547453\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"
Apr 21 11:05:16.463472 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.463474 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a14573e5-0f59-4238-accf-2448c4547453-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn\" (UID: \"a14573e5-0f59-4238-accf-2448c4547453\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"
Apr 21 11:05:16.564760 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.564653 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a14573e5-0f59-4238-accf-2448c4547453-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn\" (UID: \"a14573e5-0f59-4238-accf-2448c4547453\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"
Apr 21 11:05:16.564760 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.564700 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a14573e5-0f59-4238-accf-2448c4547453-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn\" (UID: \"a14573e5-0f59-4238-accf-2448c4547453\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"
Apr 21 11:05:16.565182 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.565154 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a14573e5-0f59-4238-accf-2448c4547453-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn\" (UID: \"a14573e5-0f59-4238-accf-2448c4547453\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"
Apr 21 11:05:16.565410 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.565393 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a14573e5-0f59-4238-accf-2448c4547453-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn\" (UID: \"a14573e5-0f59-4238-accf-2448c4547453\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"
Apr 21 11:05:16.636592 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.636550 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"
Apr 21 11:05:16.761118 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.761012 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"]
Apr 21 11:05:16.763925 ip-10-0-133-157 kubenswrapper[2577]: W0421 11:05:16.763890 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda14573e5_0f59_4238_accf_2448c4547453.slice/crio-bc5ceaff5ff8fca1e8e4b168e558e7f1d3d6b379beb95870b8af73283d3fa801 WatchSource:0}: Error finding container bc5ceaff5ff8fca1e8e4b168e558e7f1d3d6b379beb95870b8af73283d3fa801: Status 404 returned error can't find the container with id bc5ceaff5ff8fca1e8e4b168e558e7f1d3d6b379beb95870b8af73283d3fa801
Apr 21 11:05:16.765895 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:16.765864 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 21 11:05:17.505337 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:17.505299 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" event={"ID":"a14573e5-0f59-4238-accf-2448c4547453","Type":"ContainerStarted","Data":"accdd164689b16bb03af7ee2206d0a19d2f935b48ff6c5e5c02774dd34d2435e"}
Apr 21 11:05:17.505337 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:17.505339 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" event={"ID":"a14573e5-0f59-4238-accf-2448c4547453","Type":"ContainerStarted","Data":"bc5ceaff5ff8fca1e8e4b168e558e7f1d3d6b379beb95870b8af73283d3fa801"}
Apr 21 11:05:18.510009 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:18.509971 2577 generic.go:358] "Generic (PLEG): container finished" podID="a14573e5-0f59-4238-accf-2448c4547453" containerID="accdd164689b16bb03af7ee2206d0a19d2f935b48ff6c5e5c02774dd34d2435e" exitCode=0
Apr 21 11:05:18.510417 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:18.510056 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" event={"ID":"a14573e5-0f59-4238-accf-2448c4547453","Type":"ContainerDied","Data":"accdd164689b16bb03af7ee2206d0a19d2f935b48ff6c5e5c02774dd34d2435e"}
Apr 21 11:05:19.516661 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:19.516628 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" event={"ID":"a14573e5-0f59-4238-accf-2448c4547453","Type":"ContainerStarted","Data":"080be6bc05b6fd7bbf2c49eeb3fab52888133f96bc958974c9686d0ca274a0f1"}
Apr 21 11:05:19.517107 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:19.516884 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"
Apr 21 11:05:19.518306 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:19.518280 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" podUID="a14573e5-0f59-4238-accf-2448c4547453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused"
Apr 21 11:05:19.539116 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:19.539063 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" podStartSLOduration=3.539046143 podStartE2EDuration="3.539046143s" podCreationTimestamp="2026-04-21 11:05:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 11:05:19.538057468 +0000 UTC m=+3687.876591489" watchObservedRunningTime="2026-04-21 11:05:19.539046143 +0000 UTC m=+3687.877580159"
Apr 21 11:05:20.521175 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:20.521139 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" podUID="a14573e5-0f59-4238-accf-2448c4547453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused"
Apr 21 11:05:20.855133 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:20.855108 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc"
Apr 21 11:05:21.000094 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:21.000057 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59413f7c-baf1-4468-bed9-de6838b70033-kserve-provision-location\") pod \"59413f7c-baf1-4468-bed9-de6838b70033\" (UID: \"59413f7c-baf1-4468-bed9-de6838b70033\") "
Apr 21 11:05:21.000479 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:21.000451 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59413f7c-baf1-4468-bed9-de6838b70033-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "59413f7c-baf1-4468-bed9-de6838b70033" (UID: "59413f7c-baf1-4468-bed9-de6838b70033"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 11:05:21.101591 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:21.101511 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59413f7c-baf1-4468-bed9-de6838b70033-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 11:05:21.525643 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:21.525606 2577 generic.go:358] "Generic (PLEG): container finished" podID="59413f7c-baf1-4468-bed9-de6838b70033" containerID="1875172b5581f8e8a2e4d6c102bb3dca480440c41900c28f08e7af76ab1985cb" exitCode=0
Apr 21 11:05:21.525643 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:21.525646 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" event={"ID":"59413f7c-baf1-4468-bed9-de6838b70033","Type":"ContainerDied","Data":"1875172b5581f8e8a2e4d6c102bb3dca480440c41900c28f08e7af76ab1985cb"}
Apr 21 11:05:21.526155 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:21.525673 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc" event={"ID":"59413f7c-baf1-4468-bed9-de6838b70033","Type":"ContainerDied","Data":"871a0e4e0b962e5ba56be9deb47b3ce6cfec427e00fa4afa48ef5c559af62683"}
Apr 21 11:05:21.526155 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:21.525677 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc"
Apr 21 11:05:21.526155 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:21.525688 2577 scope.go:117] "RemoveContainer" containerID="1875172b5581f8e8a2e4d6c102bb3dca480440c41900c28f08e7af76ab1985cb"
Apr 21 11:05:21.534056 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:21.534034 2577 scope.go:117] "RemoveContainer" containerID="dc6acf4ea0d0c1d870d59a1b3fa398a21f51ca98db3fb03c35cf9a9c3e9e4e6f"
Apr 21 11:05:21.541595 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:21.541578 2577 scope.go:117] "RemoveContainer" containerID="1875172b5581f8e8a2e4d6c102bb3dca480440c41900c28f08e7af76ab1985cb"
Apr 21 11:05:21.541906 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:05:21.541883 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1875172b5581f8e8a2e4d6c102bb3dca480440c41900c28f08e7af76ab1985cb\": container with ID starting with 1875172b5581f8e8a2e4d6c102bb3dca480440c41900c28f08e7af76ab1985cb not found: ID does not exist" containerID="1875172b5581f8e8a2e4d6c102bb3dca480440c41900c28f08e7af76ab1985cb"
Apr 21 11:05:21.541952 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:21.541916 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1875172b5581f8e8a2e4d6c102bb3dca480440c41900c28f08e7af76ab1985cb"} err="failed to get container status \"1875172b5581f8e8a2e4d6c102bb3dca480440c41900c28f08e7af76ab1985cb\": rpc error: code = NotFound desc = could not find container \"1875172b5581f8e8a2e4d6c102bb3dca480440c41900c28f08e7af76ab1985cb\": container with ID starting with 1875172b5581f8e8a2e4d6c102bb3dca480440c41900c28f08e7af76ab1985cb not found: ID does not exist"
Apr 21 11:05:21.541952 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:21.541935 2577 scope.go:117] "RemoveContainer" containerID="dc6acf4ea0d0c1d870d59a1b3fa398a21f51ca98db3fb03c35cf9a9c3e9e4e6f"
Apr 21 11:05:21.542154 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:05:21.542138 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc6acf4ea0d0c1d870d59a1b3fa398a21f51ca98db3fb03c35cf9a9c3e9e4e6f\": container with ID starting with dc6acf4ea0d0c1d870d59a1b3fa398a21f51ca98db3fb03c35cf9a9c3e9e4e6f not found: ID does not exist" containerID="dc6acf4ea0d0c1d870d59a1b3fa398a21f51ca98db3fb03c35cf9a9c3e9e4e6f"
Apr 21 11:05:21.542188 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:21.542158 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6acf4ea0d0c1d870d59a1b3fa398a21f51ca98db3fb03c35cf9a9c3e9e4e6f"} err="failed to get container status \"dc6acf4ea0d0c1d870d59a1b3fa398a21f51ca98db3fb03c35cf9a9c3e9e4e6f\": rpc error: code = NotFound desc = could not find container \"dc6acf4ea0d0c1d870d59a1b3fa398a21f51ca98db3fb03c35cf9a9c3e9e4e6f\": container with ID starting with dc6acf4ea0d0c1d870d59a1b3fa398a21f51ca98db3fb03c35cf9a9c3e9e4e6f not found: ID does not exist"
Apr 21 11:05:21.546632 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:21.546608 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc"]
Apr 21 11:05:21.550294 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:21.550271 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-d459cc9c7-54fdc"]
Apr 21 11:05:22.231765 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:22.231707 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59413f7c-baf1-4468-bed9-de6838b70033" path="/var/lib/kubelet/pods/59413f7c-baf1-4468-bed9-de6838b70033/volumes"
Apr 21 11:05:30.521999 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:30.521955 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" podUID="a14573e5-0f59-4238-accf-2448c4547453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused"
Apr 21 11:05:40.521148 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:40.521098 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" podUID="a14573e5-0f59-4238-accf-2448c4547453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused"
Apr 21 11:05:50.521702 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:50.521651 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" podUID="a14573e5-0f59-4238-accf-2448c4547453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused"
Apr 21 11:05:56.196119 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:05:56.196085 2577 scope.go:117] "RemoveContainer" containerID="3621e15f1e93b5e6f375d30fccc4ab0d022af30be3e8a893d852ceec314ff043"
Apr 21 11:06:00.521285 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:00.521230 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" podUID="a14573e5-0f59-4238-accf-2448c4547453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused"
Apr 21 11:06:10.521947 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:10.521900 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" podUID="a14573e5-0f59-4238-accf-2448c4547453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused"
Apr 21 11:06:20.522118 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:20.522068 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" podUID="a14573e5-0f59-4238-accf-2448c4547453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused"
Apr 21 11:06:30.522955 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:30.522922 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"
Apr 21 11:06:36.364967 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:36.364925 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"]
Apr 21 11:06:36.365389 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:36.365286 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" podUID="a14573e5-0f59-4238-accf-2448c4547453" containerName="kserve-container" containerID="cri-o://080be6bc05b6fd7bbf2c49eeb3fab52888133f96bc958974c9686d0ca274a0f1" gracePeriod=30
Apr 21 11:06:37.423384 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:37.423348 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv"]
Apr 21 11:06:37.423814 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:37.423653 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59413f7c-baf1-4468-bed9-de6838b70033" containerName="storage-initializer"
Apr 21 11:06:37.423814 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:37.423665 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="59413f7c-baf1-4468-bed9-de6838b70033" containerName="storage-initializer"
Apr 21 11:06:37.423814 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:37.423677 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59413f7c-baf1-4468-bed9-de6838b70033" containerName="kserve-container"
Apr 21 11:06:37.423814 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:37.423683 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="59413f7c-baf1-4468-bed9-de6838b70033" containerName="kserve-container"
Apr 21 11:06:37.423814 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:37.423729 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="59413f7c-baf1-4468-bed9-de6838b70033" containerName="kserve-container"
Apr 21 11:06:37.426606 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:37.426586 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv"
Apr 21 11:06:37.434887 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:37.434856 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv"]
Apr 21 11:06:37.498321 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:37.498279 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a9bf823-b072-4889-8e8b-ead3d1a2a393-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv\" (UID: \"5a9bf823-b072-4889-8e8b-ead3d1a2a393\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv"
Apr 21 11:06:37.599735 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:37.599687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a9bf823-b072-4889-8e8b-ead3d1a2a393-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv\" (UID: \"5a9bf823-b072-4889-8e8b-ead3d1a2a393\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv"
Apr 21 11:06:37.600132 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:37.600105 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a9bf823-b072-4889-8e8b-ead3d1a2a393-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv\" (UID: \"5a9bf823-b072-4889-8e8b-ead3d1a2a393\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv"
Apr 21 11:06:37.738007 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:37.737956 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv"
Apr 21 11:06:37.864578 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:37.864538 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv"]
Apr 21 11:06:37.867549 ip-10-0-133-157 kubenswrapper[2577]: W0421 11:06:37.867515 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a9bf823_b072_4889_8e8b_ead3d1a2a393.slice/crio-1ff6aa8069ce61f5c7eafe48771da66cdb72822c78dedc59b41be06314b26b30 WatchSource:0}: Error finding container 1ff6aa8069ce61f5c7eafe48771da66cdb72822c78dedc59b41be06314b26b30: Status 404 returned error can't find the container with id 1ff6aa8069ce61f5c7eafe48771da66cdb72822c78dedc59b41be06314b26b30
Apr 21 11:06:38.764345 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:38.764311 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv" event={"ID":"5a9bf823-b072-4889-8e8b-ead3d1a2a393","Type":"ContainerStarted","Data":"315cce20cc9ff557a34286b0e883fa612e0a3674febc5ddce93da30235894407"}
Apr 21 11:06:38.764345 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:38.764346 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv" event={"ID":"5a9bf823-b072-4889-8e8b-ead3d1a2a393","Type":"ContainerStarted","Data":"1ff6aa8069ce61f5c7eafe48771da66cdb72822c78dedc59b41be06314b26b30"}
Apr 21 11:06:40.521146 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:40.521093 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" podUID="a14573e5-0f59-4238-accf-2448c4547453" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.66:8080: connect: connection refused"
Apr 21 11:06:41.010082 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.010053 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"
Apr 21 11:06:41.126974 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.126886 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a14573e5-0f59-4238-accf-2448c4547453-kserve-provision-location\") pod \"a14573e5-0f59-4238-accf-2448c4547453\" (UID: \"a14573e5-0f59-4238-accf-2448c4547453\") "
Apr 21 11:06:41.126974 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.126956 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a14573e5-0f59-4238-accf-2448c4547453-cabundle-cert\") pod \"a14573e5-0f59-4238-accf-2448c4547453\" (UID: \"a14573e5-0f59-4238-accf-2448c4547453\") "
Apr 21 11:06:41.127270 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.127247 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a14573e5-0f59-4238-accf-2448c4547453-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a14573e5-0f59-4238-accf-2448c4547453" (UID: "a14573e5-0f59-4238-accf-2448c4547453"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 11:06:41.127319 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.127300 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a14573e5-0f59-4238-accf-2448c4547453-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "a14573e5-0f59-4238-accf-2448c4547453" (UID: "a14573e5-0f59-4238-accf-2448c4547453"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 11:06:41.228368 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.228323 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a14573e5-0f59-4238-accf-2448c4547453-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 11:06:41.228368 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.228362 2577 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a14573e5-0f59-4238-accf-2448c4547453-cabundle-cert\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 11:06:41.774715 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.774680 2577 generic.go:358] "Generic (PLEG): container finished" podID="a14573e5-0f59-4238-accf-2448c4547453" containerID="080be6bc05b6fd7bbf2c49eeb3fab52888133f96bc958974c9686d0ca274a0f1" exitCode=0
Apr 21 11:06:41.775257 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.774762 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"
Apr 21 11:06:41.775257 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.774768 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" event={"ID":"a14573e5-0f59-4238-accf-2448c4547453","Type":"ContainerDied","Data":"080be6bc05b6fd7bbf2c49eeb3fab52888133f96bc958974c9686d0ca274a0f1"}
Apr 21 11:06:41.775257 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.774794 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn" event={"ID":"a14573e5-0f59-4238-accf-2448c4547453","Type":"ContainerDied","Data":"bc5ceaff5ff8fca1e8e4b168e558e7f1d3d6b379beb95870b8af73283d3fa801"}
Apr 21 11:06:41.775257 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.774811 2577 scope.go:117] "RemoveContainer" containerID="080be6bc05b6fd7bbf2c49eeb3fab52888133f96bc958974c9686d0ca274a0f1"
Apr 21 11:06:41.783767 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.783730 2577 scope.go:117] "RemoveContainer" containerID="accdd164689b16bb03af7ee2206d0a19d2f935b48ff6c5e5c02774dd34d2435e"
Apr 21 11:06:41.793232 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.793208 2577 scope.go:117] "RemoveContainer" containerID="080be6bc05b6fd7bbf2c49eeb3fab52888133f96bc958974c9686d0ca274a0f1"
Apr 21 11:06:41.793577 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:06:41.793555 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"080be6bc05b6fd7bbf2c49eeb3fab52888133f96bc958974c9686d0ca274a0f1\": container with ID starting with 080be6bc05b6fd7bbf2c49eeb3fab52888133f96bc958974c9686d0ca274a0f1 not found: ID does not exist" containerID="080be6bc05b6fd7bbf2c49eeb3fab52888133f96bc958974c9686d0ca274a0f1"
Apr 21 11:06:41.793673 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.793587 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"080be6bc05b6fd7bbf2c49eeb3fab52888133f96bc958974c9686d0ca274a0f1"} err="failed to get container status \"080be6bc05b6fd7bbf2c49eeb3fab52888133f96bc958974c9686d0ca274a0f1\": rpc error: code = NotFound desc = could not find container \"080be6bc05b6fd7bbf2c49eeb3fab52888133f96bc958974c9686d0ca274a0f1\": container with ID starting with 080be6bc05b6fd7bbf2c49eeb3fab52888133f96bc958974c9686d0ca274a0f1 not found: ID does not exist"
Apr 21 11:06:41.793673 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.793608 2577 scope.go:117] "RemoveContainer" containerID="accdd164689b16bb03af7ee2206d0a19d2f935b48ff6c5e5c02774dd34d2435e"
Apr 21 11:06:41.793906 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:06:41.793884 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"accdd164689b16bb03af7ee2206d0a19d2f935b48ff6c5e5c02774dd34d2435e\": container with ID starting with accdd164689b16bb03af7ee2206d0a19d2f935b48ff6c5e5c02774dd34d2435e not found: ID does not exist" containerID="accdd164689b16bb03af7ee2206d0a19d2f935b48ff6c5e5c02774dd34d2435e"
Apr 21 11:06:41.793968 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.793916 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"accdd164689b16bb03af7ee2206d0a19d2f935b48ff6c5e5c02774dd34d2435e"} err="failed to get container status \"accdd164689b16bb03af7ee2206d0a19d2f935b48ff6c5e5c02774dd34d2435e\": rpc error: code = NotFound desc = could not find container \"accdd164689b16bb03af7ee2206d0a19d2f935b48ff6c5e5c02774dd34d2435e\": container with ID starting with accdd164689b16bb03af7ee2206d0a19d2f935b48ff6c5e5c02774dd34d2435e not found: ID does not exist"
Apr 21 11:06:41.796114 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.796089 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"]
Apr 21 11:06:41.805471 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:41.800988 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-69899db4d9-2lkbn"]
Apr 21 11:06:42.233438 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:42.233402 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a14573e5-0f59-4238-accf-2448c4547453" path="/var/lib/kubelet/pods/a14573e5-0f59-4238-accf-2448c4547453/volumes"
Apr 21 11:06:44.785640 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:44.785608 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv_5a9bf823-b072-4889-8e8b-ead3d1a2a393/storage-initializer/0.log"
Apr 21 11:06:44.786080 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:44.785651 2577 generic.go:358] "Generic (PLEG): container finished" podID="5a9bf823-b072-4889-8e8b-ead3d1a2a393" containerID="315cce20cc9ff557a34286b0e883fa612e0a3674febc5ddce93da30235894407" exitCode=1
Apr 21 11:06:44.786080 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:44.785727 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv" event={"ID":"5a9bf823-b072-4889-8e8b-ead3d1a2a393","Type":"ContainerDied","Data":"315cce20cc9ff557a34286b0e883fa612e0a3674febc5ddce93da30235894407"}
Apr 21 11:06:45.789951 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:45.789922 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv_5a9bf823-b072-4889-8e8b-ead3d1a2a393/storage-initializer/0.log"
Apr 21 11:06:45.790391 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:45.790020 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv" event={"ID":"5a9bf823-b072-4889-8e8b-ead3d1a2a393","Type":"ContainerStarted","Data":"1a24a75ce3675886e43a65b0ab9ac09b5b2b6549f53d40b0071b8eeed27fe2ef"}
Apr 21 11:06:47.445358 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:47.445323 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv"]
Apr 21 11:06:47.445766 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:47.445541 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv" podUID="5a9bf823-b072-4889-8e8b-ead3d1a2a393" containerName="storage-initializer" containerID="cri-o://1a24a75ce3675886e43a65b0ab9ac09b5b2b6549f53d40b0071b8eeed27fe2ef" gracePeriod=30
Apr 21 11:06:48.501652 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.501611 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k"]
Apr 21 11:06:48.502068 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.501944 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a14573e5-0f59-4238-accf-2448c4547453" containerName="storage-initializer"
Apr 21 11:06:48.502068 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.501955 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a14573e5-0f59-4238-accf-2448c4547453" containerName="storage-initializer"
Apr 21 11:06:48.502068 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.501986 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a14573e5-0f59-4238-accf-2448c4547453" containerName="kserve-container"
Apr 21 11:06:48.502068 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.501992 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="a14573e5-0f59-4238-accf-2448c4547453" containerName="kserve-container"
Apr 21 11:06:48.502068 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.502041 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="a14573e5-0f59-4238-accf-2448c4547453" containerName="kserve-container"
Apr 21 11:06:48.504937 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.504919 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k"
Apr 21 11:06:48.507388 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.507362 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 21 11:06:48.514734 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.514705 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k"]
Apr 21 11:06:48.587397 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.587356 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d2d0f96-bf6d-487a-a502-85fa5826d0ba-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k\" (UID: \"4d2d0f96-bf6d-487a-a502-85fa5826d0ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k"
Apr 21 11:06:48.587590 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.587411 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4d2d0f96-bf6d-487a-a502-85fa5826d0ba-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k\" (UID: \"4d2d0f96-bf6d-487a-a502-85fa5826d0ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k"
Apr 21 11:06:48.688843 ip-10-0-133-157 kubenswrapper[2577]:
I0421 11:06:48.688804 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d2d0f96-bf6d-487a-a502-85fa5826d0ba-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k\" (UID: \"4d2d0f96-bf6d-487a-a502-85fa5826d0ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" Apr 21 11:06:48.689071 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.688890 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4d2d0f96-bf6d-487a-a502-85fa5826d0ba-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k\" (UID: \"4d2d0f96-bf6d-487a-a502-85fa5826d0ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" Apr 21 11:06:48.692791 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.689448 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d2d0f96-bf6d-487a-a502-85fa5826d0ba-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k\" (UID: \"4d2d0f96-bf6d-487a-a502-85fa5826d0ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" Apr 21 11:06:48.692791 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.690053 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4d2d0f96-bf6d-487a-a502-85fa5826d0ba-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k\" (UID: \"4d2d0f96-bf6d-487a-a502-85fa5826d0ba\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" Apr 21 11:06:48.816286 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.816195 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" Apr 21 11:06:48.944555 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:48.944518 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k"] Apr 21 11:06:48.948813 ip-10-0-133-157 kubenswrapper[2577]: W0421 11:06:48.948778 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d2d0f96_bf6d_487a_a502_85fa5826d0ba.slice/crio-e7527fff51e4fdec05db43afa7f080fe47928b65e10aa4af4c41b60dd129186a WatchSource:0}: Error finding container e7527fff51e4fdec05db43afa7f080fe47928b65e10aa4af4c41b60dd129186a: Status 404 returned error can't find the container with id e7527fff51e4fdec05db43afa7f080fe47928b65e10aa4af4c41b60dd129186a Apr 21 11:06:49.806030 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:49.805996 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" event={"ID":"4d2d0f96-bf6d-487a-a502-85fa5826d0ba","Type":"ContainerStarted","Data":"fdcca36acbcd93b977049045da839b66bb25ea2e9ef57604a80cabc42167d384"} Apr 21 11:06:49.806030 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:49.806031 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" event={"ID":"4d2d0f96-bf6d-487a-a502-85fa5826d0ba","Type":"ContainerStarted","Data":"e7527fff51e4fdec05db43afa7f080fe47928b65e10aa4af4c41b60dd129186a"} Apr 21 11:06:50.384861 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.384834 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv_5a9bf823-b072-4889-8e8b-ead3d1a2a393/storage-initializer/1.log" Apr 21 11:06:50.385216 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.385200 
2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv_5a9bf823-b072-4889-8e8b-ead3d1a2a393/storage-initializer/0.log" Apr 21 11:06:50.385283 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.385264 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv" Apr 21 11:06:50.505892 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.505780 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a9bf823-b072-4889-8e8b-ead3d1a2a393-kserve-provision-location\") pod \"5a9bf823-b072-4889-8e8b-ead3d1a2a393\" (UID: \"5a9bf823-b072-4889-8e8b-ead3d1a2a393\") " Apr 21 11:06:50.506082 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.506058 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a9bf823-b072-4889-8e8b-ead3d1a2a393-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5a9bf823-b072-4889-8e8b-ead3d1a2a393" (UID: "5a9bf823-b072-4889-8e8b-ead3d1a2a393"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 11:06:50.606614 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.606575 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a9bf823-b072-4889-8e8b-ead3d1a2a393-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 11:06:50.810758 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.810649 2577 generic.go:358] "Generic (PLEG): container finished" podID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerID="fdcca36acbcd93b977049045da839b66bb25ea2e9ef57604a80cabc42167d384" exitCode=0 Apr 21 11:06:50.811240 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.810737 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" event={"ID":"4d2d0f96-bf6d-487a-a502-85fa5826d0ba","Type":"ContainerDied","Data":"fdcca36acbcd93b977049045da839b66bb25ea2e9ef57604a80cabc42167d384"} Apr 21 11:06:50.811899 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.811883 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv_5a9bf823-b072-4889-8e8b-ead3d1a2a393/storage-initializer/1.log" Apr 21 11:06:50.812266 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.812253 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv_5a9bf823-b072-4889-8e8b-ead3d1a2a393/storage-initializer/0.log" Apr 21 11:06:50.812332 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.812285 2577 generic.go:358] "Generic (PLEG): container finished" podID="5a9bf823-b072-4889-8e8b-ead3d1a2a393" containerID="1a24a75ce3675886e43a65b0ab9ac09b5b2b6549f53d40b0071b8eeed27fe2ef" exitCode=1 Apr 21 11:06:50.812385 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.812355 2577 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv" Apr 21 11:06:50.812445 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.812374 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv" event={"ID":"5a9bf823-b072-4889-8e8b-ead3d1a2a393","Type":"ContainerDied","Data":"1a24a75ce3675886e43a65b0ab9ac09b5b2b6549f53d40b0071b8eeed27fe2ef"} Apr 21 11:06:50.812445 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.812421 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv" event={"ID":"5a9bf823-b072-4889-8e8b-ead3d1a2a393","Type":"ContainerDied","Data":"1ff6aa8069ce61f5c7eafe48771da66cdb72822c78dedc59b41be06314b26b30"} Apr 21 11:06:50.812551 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.812443 2577 scope.go:117] "RemoveContainer" containerID="1a24a75ce3675886e43a65b0ab9ac09b5b2b6549f53d40b0071b8eeed27fe2ef" Apr 21 11:06:50.820903 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.820880 2577 scope.go:117] "RemoveContainer" containerID="315cce20cc9ff557a34286b0e883fa612e0a3674febc5ddce93da30235894407" Apr 21 11:06:50.828460 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.828437 2577 scope.go:117] "RemoveContainer" containerID="1a24a75ce3675886e43a65b0ab9ac09b5b2b6549f53d40b0071b8eeed27fe2ef" Apr 21 11:06:50.828778 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:06:50.828735 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a24a75ce3675886e43a65b0ab9ac09b5b2b6549f53d40b0071b8eeed27fe2ef\": container with ID starting with 1a24a75ce3675886e43a65b0ab9ac09b5b2b6549f53d40b0071b8eeed27fe2ef not found: ID does not exist" containerID="1a24a75ce3675886e43a65b0ab9ac09b5b2b6549f53d40b0071b8eeed27fe2ef" Apr 21 11:06:50.828874 ip-10-0-133-157 
kubenswrapper[2577]: I0421 11:06:50.828791 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a24a75ce3675886e43a65b0ab9ac09b5b2b6549f53d40b0071b8eeed27fe2ef"} err="failed to get container status \"1a24a75ce3675886e43a65b0ab9ac09b5b2b6549f53d40b0071b8eeed27fe2ef\": rpc error: code = NotFound desc = could not find container \"1a24a75ce3675886e43a65b0ab9ac09b5b2b6549f53d40b0071b8eeed27fe2ef\": container with ID starting with 1a24a75ce3675886e43a65b0ab9ac09b5b2b6549f53d40b0071b8eeed27fe2ef not found: ID does not exist" Apr 21 11:06:50.828874 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.828817 2577 scope.go:117] "RemoveContainer" containerID="315cce20cc9ff557a34286b0e883fa612e0a3674febc5ddce93da30235894407" Apr 21 11:06:50.829111 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:06:50.829084 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"315cce20cc9ff557a34286b0e883fa612e0a3674febc5ddce93da30235894407\": container with ID starting with 315cce20cc9ff557a34286b0e883fa612e0a3674febc5ddce93da30235894407 not found: ID does not exist" containerID="315cce20cc9ff557a34286b0e883fa612e0a3674febc5ddce93da30235894407" Apr 21 11:06:50.829157 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.829122 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"315cce20cc9ff557a34286b0e883fa612e0a3674febc5ddce93da30235894407"} err="failed to get container status \"315cce20cc9ff557a34286b0e883fa612e0a3674febc5ddce93da30235894407\": rpc error: code = NotFound desc = could not find container \"315cce20cc9ff557a34286b0e883fa612e0a3674febc5ddce93da30235894407\": container with ID starting with 315cce20cc9ff557a34286b0e883fa612e0a3674febc5ddce93da30235894407 not found: ID does not exist" Apr 21 11:06:50.854464 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.854435 2577 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv"] Apr 21 11:06:50.860442 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:50.860403 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-56554d98d7-fpmtv"] Apr 21 11:06:51.817088 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:51.817054 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" event={"ID":"4d2d0f96-bf6d-487a-a502-85fa5826d0ba","Type":"ContainerStarted","Data":"1ecc1942d823b3883055c51b5a361c98f7a8612e0a84b677f061663817abb721"} Apr 21 11:06:51.817555 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:51.817241 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" Apr 21 11:06:51.818695 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:51.818667 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 21 11:06:51.834317 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:51.834254 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" podStartSLOduration=3.83423286 podStartE2EDuration="3.83423286s" podCreationTimestamp="2026-04-21 11:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 11:06:51.833085254 +0000 UTC m=+3780.171619270" watchObservedRunningTime="2026-04-21 11:06:51.83423286 +0000 UTC m=+3780.172766876" Apr 21 11:06:52.231629 ip-10-0-133-157 
kubenswrapper[2577]: I0421 11:06:52.231597 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a9bf823-b072-4889-8e8b-ead3d1a2a393" path="/var/lib/kubelet/pods/5a9bf823-b072-4889-8e8b-ead3d1a2a393/volumes" Apr 21 11:06:52.821507 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:52.821469 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 21 11:06:56.212481 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:06:56.212448 2577 scope.go:117] "RemoveContainer" containerID="7ffbdd6693d1094d56633be96c86e395dccb23f5e76904c512477599f3ff76e9" Apr 21 11:07:02.822095 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:07:02.822047 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 21 11:07:12.821484 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:07:12.821440 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 21 11:07:22.821804 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:07:22.821727 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 21 11:07:32.822435 
ip-10-0-133-157 kubenswrapper[2577]: I0421 11:07:32.822382 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 21 11:07:42.821527 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:07:42.821481 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 21 11:07:52.822486 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:07:52.822426 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 21 11:07:53.227846 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:07:53.227807 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused" Apr 21 11:08:03.229784 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:03.229730 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" Apr 21 11:08:08.570819 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:08.570776 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k"] Apr 21 
11:08:08.571245 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:08.571187 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="kserve-container" containerID="cri-o://1ecc1942d823b3883055c51b5a361c98f7a8612e0a84b677f061663817abb721" gracePeriod=30 Apr 21 11:08:09.611320 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:09.611281 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m"] Apr 21 11:08:09.611719 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:09.611590 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a9bf823-b072-4889-8e8b-ead3d1a2a393" containerName="storage-initializer" Apr 21 11:08:09.611719 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:09.611601 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9bf823-b072-4889-8e8b-ead3d1a2a393" containerName="storage-initializer" Apr 21 11:08:09.611719 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:09.611613 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a9bf823-b072-4889-8e8b-ead3d1a2a393" containerName="storage-initializer" Apr 21 11:08:09.611719 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:09.611619 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a9bf823-b072-4889-8e8b-ead3d1a2a393" containerName="storage-initializer" Apr 21 11:08:09.611719 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:09.611665 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a9bf823-b072-4889-8e8b-ead3d1a2a393" containerName="storage-initializer" Apr 21 11:08:09.611719 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:09.611675 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a9bf823-b072-4889-8e8b-ead3d1a2a393" containerName="storage-initializer" 
Apr 21 11:08:09.614552 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:09.614532 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m"
Apr 21 11:08:09.624867 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:09.624841 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m"]
Apr 21 11:08:09.740418 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:09.740372 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16d55ddd-56e2-4cd1-a703-17aafc84fa30-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m\" (UID: \"16d55ddd-56e2-4cd1-a703-17aafc84fa30\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m"
Apr 21 11:08:09.841659 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:09.841613 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16d55ddd-56e2-4cd1-a703-17aafc84fa30-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m\" (UID: \"16d55ddd-56e2-4cd1-a703-17aafc84fa30\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m"
Apr 21 11:08:09.842109 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:09.842084 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16d55ddd-56e2-4cd1-a703-17aafc84fa30-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m\" (UID: \"16d55ddd-56e2-4cd1-a703-17aafc84fa30\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m"
Apr 21 11:08:09.924980 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:09.924880 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m"
Apr 21 11:08:10.051224 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:10.051183 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m"]
Apr 21 11:08:10.054529 ip-10-0-133-157 kubenswrapper[2577]: W0421 11:08:10.054497 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16d55ddd_56e2_4cd1_a703_17aafc84fa30.slice/crio-0fe16e5e44c58232f62b563af1d73a3e708d8db315bb0f705fe438f595cf3cd5 WatchSource:0}: Error finding container 0fe16e5e44c58232f62b563af1d73a3e708d8db315bb0f705fe438f595cf3cd5: Status 404 returned error can't find the container with id 0fe16e5e44c58232f62b563af1d73a3e708d8db315bb0f705fe438f595cf3cd5
Apr 21 11:08:11.046163 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:11.046087 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m" event={"ID":"16d55ddd-56e2-4cd1-a703-17aafc84fa30","Type":"ContainerStarted","Data":"bb0ee8c235e5c0f6af608836c79c5bcc67203bf0991981d16300e1a663c34055"}
Apr 21 11:08:11.046163 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:11.046124 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m" event={"ID":"16d55ddd-56e2-4cd1-a703-17aafc84fa30","Type":"ContainerStarted","Data":"0fe16e5e44c58232f62b563af1d73a3e708d8db315bb0f705fe438f595cf3cd5"}
Apr 21 11:08:13.227926 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:13.227881 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.68:8080: connect: connection refused"
Apr 21 11:08:13.420543 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:13.420517 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k"
Apr 21 11:08:13.471128 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:13.471092 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4d2d0f96-bf6d-487a-a502-85fa5826d0ba-cabundle-cert\") pod \"4d2d0f96-bf6d-487a-a502-85fa5826d0ba\" (UID: \"4d2d0f96-bf6d-487a-a502-85fa5826d0ba\") "
Apr 21 11:08:13.471307 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:13.471195 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d2d0f96-bf6d-487a-a502-85fa5826d0ba-kserve-provision-location\") pod \"4d2d0f96-bf6d-487a-a502-85fa5826d0ba\" (UID: \"4d2d0f96-bf6d-487a-a502-85fa5826d0ba\") "
Apr 21 11:08:13.471533 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:13.471507 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d2d0f96-bf6d-487a-a502-85fa5826d0ba-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "4d2d0f96-bf6d-487a-a502-85fa5826d0ba" (UID: "4d2d0f96-bf6d-487a-a502-85fa5826d0ba"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 11:08:13.471585 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:13.471506 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d2d0f96-bf6d-487a-a502-85fa5826d0ba-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4d2d0f96-bf6d-487a-a502-85fa5826d0ba" (UID: "4d2d0f96-bf6d-487a-a502-85fa5826d0ba"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 11:08:13.572369 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:13.572321 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d2d0f96-bf6d-487a-a502-85fa5826d0ba-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 11:08:13.572369 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:13.572356 2577 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4d2d0f96-bf6d-487a-a502-85fa5826d0ba-cabundle-cert\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 11:08:14.057633 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.057607 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m_16d55ddd-56e2-4cd1-a703-17aafc84fa30/storage-initializer/0.log"
Apr 21 11:08:14.057837 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.057643 2577 generic.go:358] "Generic (PLEG): container finished" podID="16d55ddd-56e2-4cd1-a703-17aafc84fa30" containerID="bb0ee8c235e5c0f6af608836c79c5bcc67203bf0991981d16300e1a663c34055" exitCode=1
Apr 21 11:08:14.057837 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.057724 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m" event={"ID":"16d55ddd-56e2-4cd1-a703-17aafc84fa30","Type":"ContainerDied","Data":"bb0ee8c235e5c0f6af608836c79c5bcc67203bf0991981d16300e1a663c34055"}
Apr 21 11:08:14.059238 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.059215 2577 generic.go:358] "Generic (PLEG): container finished" podID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerID="1ecc1942d823b3883055c51b5a361c98f7a8612e0a84b677f061663817abb721" exitCode=0
Apr 21 11:08:14.059351 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.059267 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" event={"ID":"4d2d0f96-bf6d-487a-a502-85fa5826d0ba","Type":"ContainerDied","Data":"1ecc1942d823b3883055c51b5a361c98f7a8612e0a84b677f061663817abb721"}
Apr 21 11:08:14.059351 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.059295 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k" event={"ID":"4d2d0f96-bf6d-487a-a502-85fa5826d0ba","Type":"ContainerDied","Data":"e7527fff51e4fdec05db43afa7f080fe47928b65e10aa4af4c41b60dd129186a"}
Apr 21 11:08:14.059351 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.059316 2577 scope.go:117] "RemoveContainer" containerID="1ecc1942d823b3883055c51b5a361c98f7a8612e0a84b677f061663817abb721"
Apr 21 11:08:14.059525 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.059271 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k"
Apr 21 11:08:14.068917 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.068898 2577 scope.go:117] "RemoveContainer" containerID="fdcca36acbcd93b977049045da839b66bb25ea2e9ef57604a80cabc42167d384"
Apr 21 11:08:14.081916 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.081895 2577 scope.go:117] "RemoveContainer" containerID="1ecc1942d823b3883055c51b5a361c98f7a8612e0a84b677f061663817abb721"
Apr 21 11:08:14.082228 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:08:14.082209 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ecc1942d823b3883055c51b5a361c98f7a8612e0a84b677f061663817abb721\": container with ID starting with 1ecc1942d823b3883055c51b5a361c98f7a8612e0a84b677f061663817abb721 not found: ID does not exist" containerID="1ecc1942d823b3883055c51b5a361c98f7a8612e0a84b677f061663817abb721"
Apr 21 11:08:14.082286 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.082239 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ecc1942d823b3883055c51b5a361c98f7a8612e0a84b677f061663817abb721"} err="failed to get container status \"1ecc1942d823b3883055c51b5a361c98f7a8612e0a84b677f061663817abb721\": rpc error: code = NotFound desc = could not find container \"1ecc1942d823b3883055c51b5a361c98f7a8612e0a84b677f061663817abb721\": container with ID starting with 1ecc1942d823b3883055c51b5a361c98f7a8612e0a84b677f061663817abb721 not found: ID does not exist"
Apr 21 11:08:14.082286 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.082258 2577 scope.go:117] "RemoveContainer" containerID="fdcca36acbcd93b977049045da839b66bb25ea2e9ef57604a80cabc42167d384"
Apr 21 11:08:14.082545 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:08:14.082527 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdcca36acbcd93b977049045da839b66bb25ea2e9ef57604a80cabc42167d384\": container with ID starting with fdcca36acbcd93b977049045da839b66bb25ea2e9ef57604a80cabc42167d384 not found: ID does not exist" containerID="fdcca36acbcd93b977049045da839b66bb25ea2e9ef57604a80cabc42167d384"
Apr 21 11:08:14.082593 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.082553 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdcca36acbcd93b977049045da839b66bb25ea2e9ef57604a80cabc42167d384"} err="failed to get container status \"fdcca36acbcd93b977049045da839b66bb25ea2e9ef57604a80cabc42167d384\": rpc error: code = NotFound desc = could not find container \"fdcca36acbcd93b977049045da839b66bb25ea2e9ef57604a80cabc42167d384\": container with ID starting with fdcca36acbcd93b977049045da839b66bb25ea2e9ef57604a80cabc42167d384 not found: ID does not exist"
Apr 21 11:08:14.086144 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.086116 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k"]
Apr 21 11:08:14.088354 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.088331 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-75857865-42p5k"]
Apr 21 11:08:14.231667 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:14.231632 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" path="/var/lib/kubelet/pods/4d2d0f96-bf6d-487a-a502-85fa5826d0ba/volumes"
Apr 21 11:08:15.064862 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:15.064837 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m_16d55ddd-56e2-4cd1-a703-17aafc84fa30/storage-initializer/0.log"
Apr 21 11:08:15.065062 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:15.064898 2577 kubelet.go:2569]
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m" event={"ID":"16d55ddd-56e2-4cd1-a703-17aafc84fa30","Type":"ContainerStarted","Data":"0ccf9db0f692c0a62eadc5a666e36b3132a604d06766057a9ac886472274a684"} Apr 21 11:08:19.626671 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:19.626637 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m"] Apr 21 11:08:19.627150 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:19.626906 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m" podUID="16d55ddd-56e2-4cd1-a703-17aafc84fa30" containerName="storage-initializer" containerID="cri-o://0ccf9db0f692c0a62eadc5a666e36b3132a604d06766057a9ac886472274a684" gracePeriod=30 Apr 21 11:08:20.677646 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.677610 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw"] Apr 21 11:08:20.678094 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.677929 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="storage-initializer" Apr 21 11:08:20.678094 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.677941 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="storage-initializer" Apr 21 11:08:20.678094 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.677949 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="kserve-container" Apr 21 11:08:20.678094 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.677955 2577 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="kserve-container" Apr 21 11:08:20.678094 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.678018 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d2d0f96-bf6d-487a-a502-85fa5826d0ba" containerName="kserve-container" Apr 21 11:08:20.681077 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.681060 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" Apr 21 11:08:20.683320 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.683292 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 21 11:08:20.690577 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.690552 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw"] Apr 21 11:08:20.721114 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.721071 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ed0833ab-60c4-4801-aa3d-221bb5509e54-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw\" (UID: \"ed0833ab-60c4-4801-aa3d-221bb5509e54\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" Apr 21 11:08:20.721281 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.721138 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed0833ab-60c4-4801-aa3d-221bb5509e54-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw\" (UID: \"ed0833ab-60c4-4801-aa3d-221bb5509e54\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" Apr 21 11:08:20.822148 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.822104 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ed0833ab-60c4-4801-aa3d-221bb5509e54-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw\" (UID: \"ed0833ab-60c4-4801-aa3d-221bb5509e54\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" Apr 21 11:08:20.822360 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.822173 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed0833ab-60c4-4801-aa3d-221bb5509e54-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw\" (UID: \"ed0833ab-60c4-4801-aa3d-221bb5509e54\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" Apr 21 11:08:20.822575 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.822546 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed0833ab-60c4-4801-aa3d-221bb5509e54-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw\" (UID: \"ed0833ab-60c4-4801-aa3d-221bb5509e54\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" Apr 21 11:08:20.822790 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.822740 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ed0833ab-60c4-4801-aa3d-221bb5509e54-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw\" (UID: \"ed0833ab-60c4-4801-aa3d-221bb5509e54\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" Apr 21 11:08:20.991481 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:20.991446 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" Apr 21 11:08:21.116796 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:21.116767 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw"] Apr 21 11:08:21.119519 ip-10-0-133-157 kubenswrapper[2577]: W0421 11:08:21.119491 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded0833ab_60c4_4801_aa3d_221bb5509e54.slice/crio-20b80edfd19cacc653bd321477ac7f6a7689e396faca15807094df137456967e WatchSource:0}: Error finding container 20b80edfd19cacc653bd321477ac7f6a7689e396faca15807094df137456967e: Status 404 returned error can't find the container with id 20b80edfd19cacc653bd321477ac7f6a7689e396faca15807094df137456967e Apr 21 11:08:21.859664 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:21.859635 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m_16d55ddd-56e2-4cd1-a703-17aafc84fa30/storage-initializer/1.log" Apr 21 11:08:21.860048 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:21.860034 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m_16d55ddd-56e2-4cd1-a703-17aafc84fa30/storage-initializer/0.log" Apr 21 11:08:21.860105 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:21.860094 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m" Apr 21 11:08:21.933681 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:21.933588 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16d55ddd-56e2-4cd1-a703-17aafc84fa30-kserve-provision-location\") pod \"16d55ddd-56e2-4cd1-a703-17aafc84fa30\" (UID: \"16d55ddd-56e2-4cd1-a703-17aafc84fa30\") " Apr 21 11:08:21.933904 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:21.933880 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16d55ddd-56e2-4cd1-a703-17aafc84fa30-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "16d55ddd-56e2-4cd1-a703-17aafc84fa30" (UID: "16d55ddd-56e2-4cd1-a703-17aafc84fa30"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 21 11:08:22.034558 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.034511 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/16d55ddd-56e2-4cd1-a703-17aafc84fa30-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\"" Apr 21 11:08:22.090179 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.090152 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" event={"ID":"ed0833ab-60c4-4801-aa3d-221bb5509e54","Type":"ContainerStarted","Data":"279cedbf6644e26a7509a3a5f9099b031f6fc1e619c8df14d6dabc23e455e089"} Apr 21 11:08:22.090310 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.090186 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" 
event={"ID":"ed0833ab-60c4-4801-aa3d-221bb5509e54","Type":"ContainerStarted","Data":"20b80edfd19cacc653bd321477ac7f6a7689e396faca15807094df137456967e"} Apr 21 11:08:22.091292 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.091274 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m_16d55ddd-56e2-4cd1-a703-17aafc84fa30/storage-initializer/1.log" Apr 21 11:08:22.091687 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.091669 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m_16d55ddd-56e2-4cd1-a703-17aafc84fa30/storage-initializer/0.log" Apr 21 11:08:22.091738 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.091711 2577 generic.go:358] "Generic (PLEG): container finished" podID="16d55ddd-56e2-4cd1-a703-17aafc84fa30" containerID="0ccf9db0f692c0a62eadc5a666e36b3132a604d06766057a9ac886472274a684" exitCode=1 Apr 21 11:08:22.091812 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.091764 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m" event={"ID":"16d55ddd-56e2-4cd1-a703-17aafc84fa30","Type":"ContainerDied","Data":"0ccf9db0f692c0a62eadc5a666e36b3132a604d06766057a9ac886472274a684"} Apr 21 11:08:22.091812 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.091785 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m" Apr 21 11:08:22.091812 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.091796 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m" event={"ID":"16d55ddd-56e2-4cd1-a703-17aafc84fa30","Type":"ContainerDied","Data":"0fe16e5e44c58232f62b563af1d73a3e708d8db315bb0f705fe438f595cf3cd5"} Apr 21 11:08:22.091926 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.091817 2577 scope.go:117] "RemoveContainer" containerID="0ccf9db0f692c0a62eadc5a666e36b3132a604d06766057a9ac886472274a684" Apr 21 11:08:22.127801 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.127777 2577 scope.go:117] "RemoveContainer" containerID="bb0ee8c235e5c0f6af608836c79c5bcc67203bf0991981d16300e1a663c34055" Apr 21 11:08:22.135170 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.135152 2577 scope.go:117] "RemoveContainer" containerID="0ccf9db0f692c0a62eadc5a666e36b3132a604d06766057a9ac886472274a684" Apr 21 11:08:22.135434 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:08:22.135417 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ccf9db0f692c0a62eadc5a666e36b3132a604d06766057a9ac886472274a684\": container with ID starting with 0ccf9db0f692c0a62eadc5a666e36b3132a604d06766057a9ac886472274a684 not found: ID does not exist" containerID="0ccf9db0f692c0a62eadc5a666e36b3132a604d06766057a9ac886472274a684" Apr 21 11:08:22.135501 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.135444 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ccf9db0f692c0a62eadc5a666e36b3132a604d06766057a9ac886472274a684"} err="failed to get container status \"0ccf9db0f692c0a62eadc5a666e36b3132a604d06766057a9ac886472274a684\": rpc error: code = NotFound desc = could not find container 
\"0ccf9db0f692c0a62eadc5a666e36b3132a604d06766057a9ac886472274a684\": container with ID starting with 0ccf9db0f692c0a62eadc5a666e36b3132a604d06766057a9ac886472274a684 not found: ID does not exist" Apr 21 11:08:22.135501 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.135459 2577 scope.go:117] "RemoveContainer" containerID="bb0ee8c235e5c0f6af608836c79c5bcc67203bf0991981d16300e1a663c34055" Apr 21 11:08:22.135686 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:08:22.135672 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb0ee8c235e5c0f6af608836c79c5bcc67203bf0991981d16300e1a663c34055\": container with ID starting with bb0ee8c235e5c0f6af608836c79c5bcc67203bf0991981d16300e1a663c34055 not found: ID does not exist" containerID="bb0ee8c235e5c0f6af608836c79c5bcc67203bf0991981d16300e1a663c34055" Apr 21 11:08:22.135735 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.135689 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb0ee8c235e5c0f6af608836c79c5bcc67203bf0991981d16300e1a663c34055"} err="failed to get container status \"bb0ee8c235e5c0f6af608836c79c5bcc67203bf0991981d16300e1a663c34055\": rpc error: code = NotFound desc = could not find container \"bb0ee8c235e5c0f6af608836c79c5bcc67203bf0991981d16300e1a663c34055\": container with ID starting with bb0ee8c235e5c0f6af608836c79c5bcc67203bf0991981d16300e1a663c34055 not found: ID does not exist" Apr 21 11:08:22.152608 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.152581 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m"] Apr 21 11:08:22.157193 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:22.157163 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-6748f87c4c-vh58m"] Apr 21 11:08:22.237257 ip-10-0-133-157 kubenswrapper[2577]: 
I0421 11:08:22.234334 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16d55ddd-56e2-4cd1-a703-17aafc84fa30" path="/var/lib/kubelet/pods/16d55ddd-56e2-4cd1-a703-17aafc84fa30/volumes" Apr 21 11:08:23.096218 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:23.096183 2577 generic.go:358] "Generic (PLEG): container finished" podID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerID="279cedbf6644e26a7509a3a5f9099b031f6fc1e619c8df14d6dabc23e455e089" exitCode=0 Apr 21 11:08:23.096218 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:23.096231 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" event={"ID":"ed0833ab-60c4-4801-aa3d-221bb5509e54","Type":"ContainerDied","Data":"279cedbf6644e26a7509a3a5f9099b031f6fc1e619c8df14d6dabc23e455e089"} Apr 21 11:08:24.101150 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:24.101114 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" event={"ID":"ed0833ab-60c4-4801-aa3d-221bb5509e54","Type":"ContainerStarted","Data":"3575fba8a35258fb00df099f12dad45b2e6086e9df74129e8d30173fd43e01e8"} Apr 21 11:08:24.101642 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:24.101385 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" Apr 21 11:08:24.102608 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:24.102582 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.70:8080: connect: connection refused" Apr 21 11:08:24.125735 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:24.125680 2577 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" podStartSLOduration=4.125664693 podStartE2EDuration="4.125664693s" podCreationTimestamp="2026-04-21 11:08:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 11:08:24.123363879 +0000 UTC m=+3872.461897896" watchObservedRunningTime="2026-04-21 11:08:24.125664693 +0000 UTC m=+3872.464198709" Apr 21 11:08:25.104990 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:25.104952 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.70:8080: connect: connection refused" Apr 21 11:08:35.105685 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:35.105638 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.70:8080: connect: connection refused" Apr 21 11:08:45.105269 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:45.105219 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.70:8080: connect: connection refused" Apr 21 11:08:52.398436 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:52.398405 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 11:08:52.407390 ip-10-0-133-157 kubenswrapper[2577]: I0421 
11:08:52.407361 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log" Apr 21 11:08:55.105149 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:08:55.105097 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.70:8080: connect: connection refused" Apr 21 11:09:05.105154 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:05.105101 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.70:8080: connect: connection refused" Apr 21 11:09:15.105653 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:15.105609 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.70:8080: connect: connection refused" Apr 21 11:09:25.105621 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:25.105579 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.70:8080: connect: connection refused" Apr 21 11:09:35.106453 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:35.106421 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" 
Apr 21 11:09:40.704506 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:40.704466 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw"] Apr 21 11:09:40.704919 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:40.704697 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerName="kserve-container" containerID="cri-o://3575fba8a35258fb00df099f12dad45b2e6086e9df74129e8d30173fd43e01e8" gracePeriod=30 Apr 21 11:09:41.780153 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:41.780110 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2"] Apr 21 11:09:41.780542 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:41.780440 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16d55ddd-56e2-4cd1-a703-17aafc84fa30" containerName="storage-initializer" Apr 21 11:09:41.780542 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:41.780453 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d55ddd-56e2-4cd1-a703-17aafc84fa30" containerName="storage-initializer" Apr 21 11:09:41.780542 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:41.780468 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16d55ddd-56e2-4cd1-a703-17aafc84fa30" containerName="storage-initializer" Apr 21 11:09:41.780542 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:41.780475 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="16d55ddd-56e2-4cd1-a703-17aafc84fa30" containerName="storage-initializer" Apr 21 11:09:41.780542 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:41.780530 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="16d55ddd-56e2-4cd1-a703-17aafc84fa30" 
containerName="storage-initializer" Apr 21 11:09:41.780706 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:41.780629 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="16d55ddd-56e2-4cd1-a703-17aafc84fa30" containerName="storage-initializer" Apr 21 11:09:41.783651 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:41.783635 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2" Apr 21 11:09:41.792952 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:41.792925 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2"] Apr 21 11:09:41.852411 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:41.852360 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8273c798-7589-4ac7-b45e-7addc7bbc841-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2\" (UID: \"8273c798-7589-4ac7-b45e-7addc7bbc841\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2" Apr 21 11:09:41.953509 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:41.953463 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8273c798-7589-4ac7-b45e-7addc7bbc841-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2\" (UID: \"8273c798-7589-4ac7-b45e-7addc7bbc841\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2" Apr 21 11:09:41.953938 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:41.953912 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/8273c798-7589-4ac7-b45e-7addc7bbc841-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2\" (UID: \"8273c798-7589-4ac7-b45e-7addc7bbc841\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2"
Apr 21 11:09:42.094457 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:42.094365 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2"
Apr 21 11:09:42.217136 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:42.217103 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2"]
Apr 21 11:09:42.220105 ip-10-0-133-157 kubenswrapper[2577]: W0421 11:09:42.220077 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8273c798_7589_4ac7_b45e_7addc7bbc841.slice/crio-63cdab851d925abb8f360dffbbdddeabb4adfc155bc1ad2206f2b93830306aec WatchSource:0}: Error finding container 63cdab851d925abb8f360dffbbdddeabb4adfc155bc1ad2206f2b93830306aec: Status 404 returned error can't find the container with id 63cdab851d925abb8f360dffbbdddeabb4adfc155bc1ad2206f2b93830306aec
Apr 21 11:09:42.328527 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:42.328493 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2" event={"ID":"8273c798-7589-4ac7-b45e-7addc7bbc841","Type":"ContainerStarted","Data":"e83afc8119f44da128f28a51f9b494ccae119f5f2662a43c23d731347e754e59"}
Apr 21 11:09:42.328527 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:42.328531 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2" event={"ID":"8273c798-7589-4ac7-b45e-7addc7bbc841","Type":"ContainerStarted","Data":"63cdab851d925abb8f360dffbbdddeabb4adfc155bc1ad2206f2b93830306aec"}
Apr 21 11:09:45.105111 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:45.105063 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.70:8080: connect: connection refused"
Apr 21 11:09:45.338059 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:45.338025 2577 generic.go:358] "Generic (PLEG): container finished" podID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerID="3575fba8a35258fb00df099f12dad45b2e6086e9df74129e8d30173fd43e01e8" exitCode=0
Apr 21 11:09:45.338191 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:45.338093 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" event={"ID":"ed0833ab-60c4-4801-aa3d-221bb5509e54","Type":"ContainerDied","Data":"3575fba8a35258fb00df099f12dad45b2e6086e9df74129e8d30173fd43e01e8"}
Apr 21 11:09:45.338191 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:45.338131 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw" event={"ID":"ed0833ab-60c4-4801-aa3d-221bb5509e54","Type":"ContainerDied","Data":"20b80edfd19cacc653bd321477ac7f6a7689e396faca15807094df137456967e"}
Apr 21 11:09:45.338191 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:45.338144 2577 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20b80edfd19cacc653bd321477ac7f6a7689e396faca15807094df137456967e"
Apr 21 11:09:45.348438 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:45.348416 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw"
Apr 21 11:09:45.379860 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:45.379773 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed0833ab-60c4-4801-aa3d-221bb5509e54-kserve-provision-location\") pod \"ed0833ab-60c4-4801-aa3d-221bb5509e54\" (UID: \"ed0833ab-60c4-4801-aa3d-221bb5509e54\") "
Apr 21 11:09:45.379860 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:45.379811 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ed0833ab-60c4-4801-aa3d-221bb5509e54-cabundle-cert\") pod \"ed0833ab-60c4-4801-aa3d-221bb5509e54\" (UID: \"ed0833ab-60c4-4801-aa3d-221bb5509e54\") "
Apr 21 11:09:45.380125 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:45.380102 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed0833ab-60c4-4801-aa3d-221bb5509e54-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ed0833ab-60c4-4801-aa3d-221bb5509e54" (UID: "ed0833ab-60c4-4801-aa3d-221bb5509e54"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 11:09:45.380241 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:45.380215 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed0833ab-60c4-4801-aa3d-221bb5509e54-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "ed0833ab-60c4-4801-aa3d-221bb5509e54" (UID: "ed0833ab-60c4-4801-aa3d-221bb5509e54"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 21 11:09:45.480437 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:45.480396 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed0833ab-60c4-4801-aa3d-221bb5509e54-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 11:09:45.480437 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:45.480428 2577 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ed0833ab-60c4-4801-aa3d-221bb5509e54-cabundle-cert\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 11:09:46.342743 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:46.342664 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2_8273c798-7589-4ac7-b45e-7addc7bbc841/storage-initializer/0.log"
Apr 21 11:09:46.342743 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:46.342707 2577 generic.go:358] "Generic (PLEG): container finished" podID="8273c798-7589-4ac7-b45e-7addc7bbc841" containerID="e83afc8119f44da128f28a51f9b494ccae119f5f2662a43c23d731347e754e59" exitCode=1
Apr 21 11:09:46.343253 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:46.342787 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2" event={"ID":"8273c798-7589-4ac7-b45e-7addc7bbc841","Type":"ContainerDied","Data":"e83afc8119f44da128f28a51f9b494ccae119f5f2662a43c23d731347e754e59"}
Apr 21 11:09:46.343253 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:46.342892 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw"
Apr 21 11:09:46.370045 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:46.370016 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw"]
Apr 21 11:09:46.375452 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:46.375423 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c79d95d48-q4ztw"]
Apr 21 11:09:47.347692 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:47.347664 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2_8273c798-7589-4ac7-b45e-7addc7bbc841/storage-initializer/0.log"
Apr 21 11:09:47.348096 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:47.347770 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2" event={"ID":"8273c798-7589-4ac7-b45e-7addc7bbc841","Type":"ContainerStarted","Data":"cc9f72ee731a57176d2a64b0b412a8f64a83d7074da8cb2ee63ab3a7ad2ded41"}
Apr 21 11:09:48.231420 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:48.231383 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" path="/var/lib/kubelet/pods/ed0833ab-60c4-4801-aa3d-221bb5509e54/volumes"
Apr 21 11:09:51.359889 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:51.359864 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2_8273c798-7589-4ac7-b45e-7addc7bbc841/storage-initializer/1.log"
Apr 21 11:09:51.360326 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:51.360218 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2_8273c798-7589-4ac7-b45e-7addc7bbc841/storage-initializer/0.log"
Apr 21 11:09:51.360326 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:51.360248 2577 generic.go:358] "Generic (PLEG): container finished" podID="8273c798-7589-4ac7-b45e-7addc7bbc841" containerID="cc9f72ee731a57176d2a64b0b412a8f64a83d7074da8cb2ee63ab3a7ad2ded41" exitCode=1
Apr 21 11:09:51.360326 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:51.360306 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2" event={"ID":"8273c798-7589-4ac7-b45e-7addc7bbc841","Type":"ContainerDied","Data":"cc9f72ee731a57176d2a64b0b412a8f64a83d7074da8cb2ee63ab3a7ad2ded41"}
Apr 21 11:09:51.360494 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:51.360337 2577 scope.go:117] "RemoveContainer" containerID="e83afc8119f44da128f28a51f9b494ccae119f5f2662a43c23d731347e754e59"
Apr 21 11:09:51.360720 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:51.360702 2577 scope.go:117] "RemoveContainer" containerID="e83afc8119f44da128f28a51f9b494ccae119f5f2662a43c23d731347e754e59"
Apr 21 11:09:51.370891 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:09:51.370854 2577 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2_kserve-ci-e2e-test_8273c798-7589-4ac7-b45e-7addc7bbc841_0 in pod sandbox 63cdab851d925abb8f360dffbbdddeabb4adfc155bc1ad2206f2b93830306aec from index: no such id: 'e83afc8119f44da128f28a51f9b494ccae119f5f2662a43c23d731347e754e59'" containerID="e83afc8119f44da128f28a51f9b494ccae119f5f2662a43c23d731347e754e59"
Apr 21 11:09:51.371007 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:51.370898 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e83afc8119f44da128f28a51f9b494ccae119f5f2662a43c23d731347e754e59"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2_kserve-ci-e2e-test_8273c798-7589-4ac7-b45e-7addc7bbc841_0 in pod sandbox 63cdab851d925abb8f360dffbbdddeabb4adfc155bc1ad2206f2b93830306aec from index: no such id: 'e83afc8119f44da128f28a51f9b494ccae119f5f2662a43c23d731347e754e59'"
Apr 21 11:09:51.371091 ip-10-0-133-157 kubenswrapper[2577]: E0421 11:09:51.371070 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2_kserve-ci-e2e-test(8273c798-7589-4ac7-b45e-7addc7bbc841)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2" podUID="8273c798-7589-4ac7-b45e-7addc7bbc841"
Apr 21 11:09:51.789149 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:51.789111 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2"]
Apr 21 11:09:52.366985 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:52.366959 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2_8273c798-7589-4ac7-b45e-7addc7bbc841/storage-initializer/1.log"
Apr 21 11:09:52.497955 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:52.497926 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2_8273c798-7589-4ac7-b45e-7addc7bbc841/storage-initializer/1.log"
Apr 21 11:09:52.498130 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:52.497996 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2"
Apr 21 11:09:52.534468 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:52.534423 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8273c798-7589-4ac7-b45e-7addc7bbc841-kserve-provision-location\") pod \"8273c798-7589-4ac7-b45e-7addc7bbc841\" (UID: \"8273c798-7589-4ac7-b45e-7addc7bbc841\") "
Apr 21 11:09:52.534733 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:52.534709 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8273c798-7589-4ac7-b45e-7addc7bbc841-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8273c798-7589-4ac7-b45e-7addc7bbc841" (UID: "8273c798-7589-4ac7-b45e-7addc7bbc841"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 11:09:52.634995 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:52.634914 2577 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8273c798-7589-4ac7-b45e-7addc7bbc841-kserve-provision-location\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 11:09:53.370545 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.370515 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2_8273c798-7589-4ac7-b45e-7addc7bbc841/storage-initializer/1.log"
Apr 21 11:09:53.371058 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.370625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2" event={"ID":"8273c798-7589-4ac7-b45e-7addc7bbc841","Type":"ContainerDied","Data":"63cdab851d925abb8f360dffbbdddeabb4adfc155bc1ad2206f2b93830306aec"}
Apr 21 11:09:53.371058 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.370659 2577 scope.go:117] "RemoveContainer" containerID="cc9f72ee731a57176d2a64b0b412a8f64a83d7074da8cb2ee63ab3a7ad2ded41"
Apr 21 11:09:53.371058 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.370728 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2"
Apr 21 11:09:53.405973 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.405939 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2"]
Apr 21 11:09:53.409226 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.409193 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-596c58c69d-c7ql2"]
Apr 21 11:09:53.567415 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.567378 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-456xf/must-gather-dfjvv"]
Apr 21 11:09:53.567720 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.567709 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8273c798-7589-4ac7-b45e-7addc7bbc841" containerName="storage-initializer"
Apr 21 11:09:53.567811 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.567722 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8273c798-7589-4ac7-b45e-7addc7bbc841" containerName="storage-initializer"
Apr 21 11:09:53.567811 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.567741 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerName="kserve-container"
Apr 21 11:09:53.567811 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.567761 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerName="kserve-container"
Apr 21 11:09:53.567811 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.567770 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerName="storage-initializer"
Apr 21 11:09:53.567811 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.567778 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerName="storage-initializer"
Apr 21 11:09:53.567811 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.567792 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8273c798-7589-4ac7-b45e-7addc7bbc841" containerName="storage-initializer"
Apr 21 11:09:53.567811 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.567798 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="8273c798-7589-4ac7-b45e-7addc7bbc841" containerName="storage-initializer"
Apr 21 11:09:53.568035 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.567842 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="8273c798-7589-4ac7-b45e-7addc7bbc841" containerName="storage-initializer"
Apr 21 11:09:53.568035 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.567851 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="8273c798-7589-4ac7-b45e-7addc7bbc841" containerName="storage-initializer"
Apr 21 11:09:53.568035 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.567860 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed0833ab-60c4-4801-aa3d-221bb5509e54" containerName="kserve-container"
Apr 21 11:09:53.572054 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.572034 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-456xf/must-gather-dfjvv"
Apr 21 11:09:53.574603 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.574577 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-456xf\"/\"default-dockercfg-fkr78\""
Apr 21 11:09:53.574738 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.574577 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-456xf\"/\"openshift-service-ca.crt\""
Apr 21 11:09:53.574738 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.574670 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-456xf\"/\"kube-root-ca.crt\""
Apr 21 11:09:53.580629 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.580603 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-456xf/must-gather-dfjvv"]
Apr 21 11:09:53.648246 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.648139 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bckzn\" (UniqueName: \"kubernetes.io/projected/b59e365d-cbe1-4dd4-b357-66fec4e8c781-kube-api-access-bckzn\") pod \"must-gather-dfjvv\" (UID: \"b59e365d-cbe1-4dd4-b357-66fec4e8c781\") " pod="openshift-must-gather-456xf/must-gather-dfjvv"
Apr 21 11:09:53.648246 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.648183 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b59e365d-cbe1-4dd4-b357-66fec4e8c781-must-gather-output\") pod \"must-gather-dfjvv\" (UID: \"b59e365d-cbe1-4dd4-b357-66fec4e8c781\") " pod="openshift-must-gather-456xf/must-gather-dfjvv"
Apr 21 11:09:53.749447 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.749399 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bckzn\" (UniqueName: \"kubernetes.io/projected/b59e365d-cbe1-4dd4-b357-66fec4e8c781-kube-api-access-bckzn\") pod \"must-gather-dfjvv\" (UID: \"b59e365d-cbe1-4dd4-b357-66fec4e8c781\") " pod="openshift-must-gather-456xf/must-gather-dfjvv"
Apr 21 11:09:53.749447 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.749449 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b59e365d-cbe1-4dd4-b357-66fec4e8c781-must-gather-output\") pod \"must-gather-dfjvv\" (UID: \"b59e365d-cbe1-4dd4-b357-66fec4e8c781\") " pod="openshift-must-gather-456xf/must-gather-dfjvv"
Apr 21 11:09:53.749884 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.749861 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b59e365d-cbe1-4dd4-b357-66fec4e8c781-must-gather-output\") pod \"must-gather-dfjvv\" (UID: \"b59e365d-cbe1-4dd4-b357-66fec4e8c781\") " pod="openshift-must-gather-456xf/must-gather-dfjvv"
Apr 21 11:09:53.758028 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.758003 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bckzn\" (UniqueName: \"kubernetes.io/projected/b59e365d-cbe1-4dd4-b357-66fec4e8c781-kube-api-access-bckzn\") pod \"must-gather-dfjvv\" (UID: \"b59e365d-cbe1-4dd4-b357-66fec4e8c781\") " pod="openshift-must-gather-456xf/must-gather-dfjvv"
Apr 21 11:09:53.882405 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:53.882350 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-456xf/must-gather-dfjvv"
Apr 21 11:09:54.004905 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:54.004865 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-456xf/must-gather-dfjvv"]
Apr 21 11:09:54.008672 ip-10-0-133-157 kubenswrapper[2577]: W0421 11:09:54.008639 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb59e365d_cbe1_4dd4_b357_66fec4e8c781.slice/crio-5633790c7faf236d6eda356efb94d08617dca11b7e5ca29c1d82ab5e5036ebeb WatchSource:0}: Error finding container 5633790c7faf236d6eda356efb94d08617dca11b7e5ca29c1d82ab5e5036ebeb: Status 404 returned error can't find the container with id 5633790c7faf236d6eda356efb94d08617dca11b7e5ca29c1d82ab5e5036ebeb
Apr 21 11:09:54.231813 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:54.231774 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8273c798-7589-4ac7-b45e-7addc7bbc841" path="/var/lib/kubelet/pods/8273c798-7589-4ac7-b45e-7addc7bbc841/volumes"
Apr 21 11:09:54.375659 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:54.375625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-456xf/must-gather-dfjvv" event={"ID":"b59e365d-cbe1-4dd4-b357-66fec4e8c781","Type":"ContainerStarted","Data":"5633790c7faf236d6eda356efb94d08617dca11b7e5ca29c1d82ab5e5036ebeb"}
Apr 21 11:09:59.393327 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:59.393281 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-456xf/must-gather-dfjvv" event={"ID":"b59e365d-cbe1-4dd4-b357-66fec4e8c781","Type":"ContainerStarted","Data":"4f512c67c115bda6197669582f550e0cb44587842cc0b3699a1e5a58a03e42d6"}
Apr 21 11:09:59.393327 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:59.393331 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-456xf/must-gather-dfjvv" event={"ID":"b59e365d-cbe1-4dd4-b357-66fec4e8c781","Type":"ContainerStarted","Data":"3e01b125b3f4942e3919d94c703b96200e04075c131e9d0496d6ba47c077e41f"}
Apr 21 11:09:59.411356 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:09:59.411303 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-456xf/must-gather-dfjvv" podStartSLOduration=2.12844764 podStartE2EDuration="6.411287268s" podCreationTimestamp="2026-04-21 11:09:53 +0000 UTC" firstStartedPulling="2026-04-21 11:09:54.010424263 +0000 UTC m=+3962.348958258" lastFinishedPulling="2026-04-21 11:09:58.293263876 +0000 UTC m=+3966.631797886" observedRunningTime="2026-04-21 11:09:59.409445587 +0000 UTC m=+3967.747979605" watchObservedRunningTime="2026-04-21 11:09:59.411287268 +0000 UTC m=+3967.749821285"
Apr 21 11:10:19.464611 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:19.464573 2577 generic.go:358] "Generic (PLEG): container finished" podID="b59e365d-cbe1-4dd4-b357-66fec4e8c781" containerID="3e01b125b3f4942e3919d94c703b96200e04075c131e9d0496d6ba47c077e41f" exitCode=0
Apr 21 11:10:19.465063 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:19.464637 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-456xf/must-gather-dfjvv" event={"ID":"b59e365d-cbe1-4dd4-b357-66fec4e8c781","Type":"ContainerDied","Data":"3e01b125b3f4942e3919d94c703b96200e04075c131e9d0496d6ba47c077e41f"}
Apr 21 11:10:19.465063 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:19.464971 2577 scope.go:117] "RemoveContainer" containerID="3e01b125b3f4942e3919d94c703b96200e04075c131e9d0496d6ba47c077e41f"
Apr 21 11:10:19.812443 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:19.812360 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-456xf_must-gather-dfjvv_b59e365d-cbe1-4dd4-b357-66fec4e8c781/gather/0.log"
Apr 21 11:10:23.086779 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:23.086729 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gslnz_6c7463f0-ead6-4060-b7c0-a5b97811f455/global-pull-secret-syncer/0.log"
Apr 21 11:10:23.317909 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:23.317879 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-8c424_b7b9a85e-3e84-4e88-b193-a2eff9d45b6a/konnectivity-agent/0.log"
Apr 21 11:10:23.372055 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:23.371955 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-157.ec2.internal_6c2c9c59cdaa3a4ce6126af55beb4c88/haproxy/0.log"
Apr 21 11:10:25.263424 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:25.263388 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-456xf/must-gather-dfjvv"]
Apr 21 11:10:25.264006 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:25.263603 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-456xf/must-gather-dfjvv" podUID="b59e365d-cbe1-4dd4-b357-66fec4e8c781" containerName="copy" containerID="cri-o://4f512c67c115bda6197669582f550e0cb44587842cc0b3699a1e5a58a03e42d6" gracePeriod=2
Apr 21 11:10:25.266293 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:25.266256 2577 status_manager.go:895] "Failed to get status for pod" podUID="b59e365d-cbe1-4dd4-b357-66fec4e8c781" pod="openshift-must-gather-456xf/must-gather-dfjvv" err="pods \"must-gather-dfjvv\" is forbidden: User \"system:node:ip-10-0-133-157.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-456xf\": no relationship found between node 'ip-10-0-133-157.ec2.internal' and this object"
Apr 21 11:10:25.267121 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:25.267097 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-456xf/must-gather-dfjvv"]
Apr 21 11:10:25.483871 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:25.483842 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-456xf_must-gather-dfjvv_b59e365d-cbe1-4dd4-b357-66fec4e8c781/copy/0.log"
Apr 21 11:10:25.484245 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:25.484220 2577 generic.go:358] "Generic (PLEG): container finished" podID="b59e365d-cbe1-4dd4-b357-66fec4e8c781" containerID="4f512c67c115bda6197669582f550e0cb44587842cc0b3699a1e5a58a03e42d6" exitCode=143
Apr 21 11:10:25.532048 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:25.532024 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-456xf_must-gather-dfjvv_b59e365d-cbe1-4dd4-b357-66fec4e8c781/copy/0.log"
Apr 21 11:10:25.532394 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:25.532378 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-456xf/must-gather-dfjvv"
Apr 21 11:10:25.534531 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:25.534506 2577 status_manager.go:895] "Failed to get status for pod" podUID="b59e365d-cbe1-4dd4-b357-66fec4e8c781" pod="openshift-must-gather-456xf/must-gather-dfjvv" err="pods \"must-gather-dfjvv\" is forbidden: User \"system:node:ip-10-0-133-157.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-456xf\": no relationship found between node 'ip-10-0-133-157.ec2.internal' and this object"
Apr 21 11:10:25.623105 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:25.623068 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bckzn\" (UniqueName: \"kubernetes.io/projected/b59e365d-cbe1-4dd4-b357-66fec4e8c781-kube-api-access-bckzn\") pod \"b59e365d-cbe1-4dd4-b357-66fec4e8c781\" (UID: \"b59e365d-cbe1-4dd4-b357-66fec4e8c781\") "
Apr 21 11:10:25.623302 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:25.623150 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b59e365d-cbe1-4dd4-b357-66fec4e8c781-must-gather-output\") pod \"b59e365d-cbe1-4dd4-b357-66fec4e8c781\" (UID: \"b59e365d-cbe1-4dd4-b357-66fec4e8c781\") "
Apr 21 11:10:25.624761 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:25.624724 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b59e365d-cbe1-4dd4-b357-66fec4e8c781-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "b59e365d-cbe1-4dd4-b357-66fec4e8c781" (UID: "b59e365d-cbe1-4dd4-b357-66fec4e8c781"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 21 11:10:25.625372 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:25.625346 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b59e365d-cbe1-4dd4-b357-66fec4e8c781-kube-api-access-bckzn" (OuterVolumeSpecName: "kube-api-access-bckzn") pod "b59e365d-cbe1-4dd4-b357-66fec4e8c781" (UID: "b59e365d-cbe1-4dd4-b357-66fec4e8c781"). InnerVolumeSpecName "kube-api-access-bckzn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 21 11:10:25.724668 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:25.724630 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bckzn\" (UniqueName: \"kubernetes.io/projected/b59e365d-cbe1-4dd4-b357-66fec4e8c781-kube-api-access-bckzn\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 11:10:25.724668 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:25.724659 2577 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/b59e365d-cbe1-4dd4-b357-66fec4e8c781-must-gather-output\") on node \"ip-10-0-133-157.ec2.internal\" DevicePath \"\""
Apr 21 11:10:26.232967 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:26.232933 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b59e365d-cbe1-4dd4-b357-66fec4e8c781" path="/var/lib/kubelet/pods/b59e365d-cbe1-4dd4-b357-66fec4e8c781/volumes"
Apr 21 11:10:26.488055 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:26.487980 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-456xf_must-gather-dfjvv_b59e365d-cbe1-4dd4-b357-66fec4e8c781/copy/0.log"
Apr 21 11:10:26.488431 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:26.488337 2577 scope.go:117] "RemoveContainer" containerID="4f512c67c115bda6197669582f550e0cb44587842cc0b3699a1e5a58a03e42d6"
Apr 21 11:10:26.488431 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:26.488361 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-456xf/must-gather-dfjvv"
Apr 21 11:10:26.495623 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:26.495602 2577 scope.go:117] "RemoveContainer" containerID="3e01b125b3f4942e3919d94c703b96200e04075c131e9d0496d6ba47c077e41f"
Apr 21 11:10:27.220093 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:27.220060 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-4bsk9_8f4fd6b3-4c2d-49bd-a059-b4f9a8b90d38/monitoring-plugin/0.log"
Apr 21 11:10:27.397945 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:27.397914 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wfs5l_1957a663-aa91-445e-af62-0b93aec5c600/node-exporter/0.log"
Apr 21 11:10:27.418182 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:27.418153 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wfs5l_1957a663-aa91-445e-af62-0b93aec5c600/kube-rbac-proxy/0.log"
Apr 21 11:10:27.441946 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:27.441923 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-wfs5l_1957a663-aa91-445e-af62-0b93aec5c600/init-textfile/0.log"
Apr 21 11:10:27.894026 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:27.893990 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d57f5d8f8-lvwjb_afe4eaaa-27cd-4437-9086-34a188a2d172/thanos-query/0.log"
Apr 21 11:10:27.921448 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:27.921421 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d57f5d8f8-lvwjb_afe4eaaa-27cd-4437-9086-34a188a2d172/kube-rbac-proxy-web/0.log"
Apr 21 11:10:27.947700 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:27.947673 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d57f5d8f8-lvwjb_afe4eaaa-27cd-4437-9086-34a188a2d172/kube-rbac-proxy/0.log"
Apr 21 11:10:27.972902 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:27.972870 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d57f5d8f8-lvwjb_afe4eaaa-27cd-4437-9086-34a188a2d172/prom-label-proxy/0.log"
Apr 21 11:10:27.999081 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:27.999048 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d57f5d8f8-lvwjb_afe4eaaa-27cd-4437-9086-34a188a2d172/kube-rbac-proxy-rules/0.log"
Apr 21 11:10:28.023429 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:28.023397 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d57f5d8f8-lvwjb_afe4eaaa-27cd-4437-9086-34a188a2d172/kube-rbac-proxy-metrics/0.log"
Apr 21 11:10:29.510698 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.510666 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/2.log"
Apr 21 11:10:29.518597 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.518572 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-9vsls_c7fc40ec-8080-4847-9b12-78671f916c03/console-operator/3.log"
Apr 21 11:10:29.814931 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.814850 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k"]
Apr 21 11:10:29.815169 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.815156 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b59e365d-cbe1-4dd4-b357-66fec4e8c781" containerName="copy"
Apr 21 11:10:29.815226 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.815171 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b59e365d-cbe1-4dd4-b357-66fec4e8c781" containerName="copy"
Apr 21 11:10:29.815226 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.815188 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b59e365d-cbe1-4dd4-b357-66fec4e8c781" containerName="gather"
Apr 21 11:10:29.815226 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.815194 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="b59e365d-cbe1-4dd4-b357-66fec4e8c781" containerName="gather"
Apr 21 11:10:29.815337 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.815243 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b59e365d-cbe1-4dd4-b357-66fec4e8c781" containerName="copy"
Apr 21 11:10:29.815337 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.815253 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="b59e365d-cbe1-4dd4-b357-66fec4e8c781" containerName="gather"
Apr 21 11:10:29.820224 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.820203 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k"
Apr 21 11:10:29.822609 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.822584 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x9mhh\"/\"kube-root-ca.crt\""
Apr 21 11:10:29.822609 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.822604 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x9mhh\"/\"openshift-service-ca.crt\""
Apr 21 11:10:29.823359 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.823340 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-x9mhh\"/\"default-dockercfg-bdbtk\""
Apr 21 11:10:29.828987 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.828961 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k"]
Apr 21 11:10:29.913171 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.913130 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-65885cbd5d-6gqst_276784b9-3960-4ce8-b86d-cc031d35d6cd/console/0.log"
Apr 21 11:10:29.958804 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.958761 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8f26c376-70b3-41e7-af4f-ba5f5cbc06db-lib-modules\") pod \"perf-node-gather-daemonset-4tl7k\" (UID: \"8f26c376-70b3-41e7-af4f-ba5f5cbc06db\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k"
Apr 21 11:10:29.959021 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.958818 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8f26c376-70b3-41e7-af4f-ba5f5cbc06db-podres\") pod \"perf-node-gather-daemonset-4tl7k\" (UID:
\"8f26c376-70b3-41e7-af4f-ba5f5cbc06db\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:29.959021 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.958885 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8f26c376-70b3-41e7-af4f-ba5f5cbc06db-proc\") pod \"perf-node-gather-daemonset-4tl7k\" (UID: \"8f26c376-70b3-41e7-af4f-ba5f5cbc06db\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:29.959021 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.958990 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f26c376-70b3-41e7-af4f-ba5f5cbc06db-sys\") pod \"perf-node-gather-daemonset-4tl7k\" (UID: \"8f26c376-70b3-41e7-af4f-ba5f5cbc06db\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:29.959224 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.959027 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zph9z\" (UniqueName: \"kubernetes.io/projected/8f26c376-70b3-41e7-af4f-ba5f5cbc06db-kube-api-access-zph9z\") pod \"perf-node-gather-daemonset-4tl7k\" (UID: \"8f26c376-70b3-41e7-af4f-ba5f5cbc06db\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:29.959275 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:29.959255 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-4wrmx_00a9f177-852d-4071-8823-418bcec59544/download-server/0.log" Apr 21 11:10:30.059938 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.059896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8f26c376-70b3-41e7-af4f-ba5f5cbc06db-lib-modules\") pod 
\"perf-node-gather-daemonset-4tl7k\" (UID: \"8f26c376-70b3-41e7-af4f-ba5f5cbc06db\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:30.059938 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.059942 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8f26c376-70b3-41e7-af4f-ba5f5cbc06db-podres\") pod \"perf-node-gather-daemonset-4tl7k\" (UID: \"8f26c376-70b3-41e7-af4f-ba5f5cbc06db\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:30.060158 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.059985 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8f26c376-70b3-41e7-af4f-ba5f5cbc06db-proc\") pod \"perf-node-gather-daemonset-4tl7k\" (UID: \"8f26c376-70b3-41e7-af4f-ba5f5cbc06db\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:30.060158 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.060027 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f26c376-70b3-41e7-af4f-ba5f5cbc06db-sys\") pod \"perf-node-gather-daemonset-4tl7k\" (UID: \"8f26c376-70b3-41e7-af4f-ba5f5cbc06db\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:30.060158 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.060044 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zph9z\" (UniqueName: \"kubernetes.io/projected/8f26c376-70b3-41e7-af4f-ba5f5cbc06db-kube-api-access-zph9z\") pod \"perf-node-gather-daemonset-4tl7k\" (UID: \"8f26c376-70b3-41e7-af4f-ba5f5cbc06db\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:30.060158 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.060088 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8f26c376-70b3-41e7-af4f-ba5f5cbc06db-lib-modules\") pod \"perf-node-gather-daemonset-4tl7k\" (UID: \"8f26c376-70b3-41e7-af4f-ba5f5cbc06db\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:30.060158 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.060104 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8f26c376-70b3-41e7-af4f-ba5f5cbc06db-proc\") pod \"perf-node-gather-daemonset-4tl7k\" (UID: \"8f26c376-70b3-41e7-af4f-ba5f5cbc06db\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:30.060158 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.060127 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f26c376-70b3-41e7-af4f-ba5f5cbc06db-sys\") pod \"perf-node-gather-daemonset-4tl7k\" (UID: \"8f26c376-70b3-41e7-af4f-ba5f5cbc06db\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:30.060158 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.060131 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8f26c376-70b3-41e7-af4f-ba5f5cbc06db-podres\") pod \"perf-node-gather-daemonset-4tl7k\" (UID: \"8f26c376-70b3-41e7-af4f-ba5f5cbc06db\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:30.068138 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.068054 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zph9z\" (UniqueName: \"kubernetes.io/projected/8f26c376-70b3-41e7-af4f-ba5f5cbc06db-kube-api-access-zph9z\") pod \"perf-node-gather-daemonset-4tl7k\" (UID: \"8f26c376-70b3-41e7-af4f-ba5f5cbc06db\") " pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:30.131248 ip-10-0-133-157 kubenswrapper[2577]: I0421 
11:10:30.131206 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:30.260360 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.260321 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k"] Apr 21 11:10:30.263847 ip-10-0-133-157 kubenswrapper[2577]: W0421 11:10:30.263815 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8f26c376_70b3_41e7_af4f_ba5f5cbc06db.slice/crio-27e234cc1a1e20dd0145be370a11348f77ff98d9aaca79d25fe7bcce3f2a3117 WatchSource:0}: Error finding container 27e234cc1a1e20dd0145be370a11348f77ff98d9aaca79d25fe7bcce3f2a3117: Status 404 returned error can't find the container with id 27e234cc1a1e20dd0145be370a11348f77ff98d9aaca79d25fe7bcce3f2a3117 Apr 21 11:10:30.265325 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.265310 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 21 11:10:30.347484 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.347407 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-wbqzj_1b0d20d2-b3e2-4083-9dd1-a3a892bbf98a/volume-data-source-validator/0.log" Apr 21 11:10:30.501801 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.501760 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" event={"ID":"8f26c376-70b3-41e7-af4f-ba5f5cbc06db","Type":"ContainerStarted","Data":"a6428f85fc0b05ae72f12619c4028971c8e904387970b2f57390522bf48520a2"} Apr 21 11:10:30.501801 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.501806 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" 
event={"ID":"8f26c376-70b3-41e7-af4f-ba5f5cbc06db","Type":"ContainerStarted","Data":"27e234cc1a1e20dd0145be370a11348f77ff98d9aaca79d25fe7bcce3f2a3117"} Apr 21 11:10:30.502048 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.501919 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:30.518119 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.518056 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" podStartSLOduration=1.518035792 podStartE2EDuration="1.518035792s" podCreationTimestamp="2026-04-21 11:10:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-21 11:10:30.51742742 +0000 UTC m=+3998.855961437" watchObservedRunningTime="2026-04-21 11:10:30.518035792 +0000 UTC m=+3998.856569811" Apr 21 11:10:30.993894 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:30.993867 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-75v2d_5a4cb6b4-0870-47bc-b13c-24f96bc4d282/dns/0.log" Apr 21 11:10:31.018330 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:31.018302 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-75v2d_5a4cb6b4-0870-47bc-b13c-24f96bc4d282/kube-rbac-proxy/0.log" Apr 21 11:10:31.201721 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:31.201693 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-c588q_186c1594-0ba9-495b-8213-27692e681b57/dns-node-resolver/0.log" Apr 21 11:10:31.614095 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:31.614064 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-4ggkj_b4e66db4-f8f5-415c-aac2-60c02dfc43ff/node-ca/0.log" Apr 21 11:10:32.638980 ip-10-0-133-157 kubenswrapper[2577]: I0421 
11:10:32.638951 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-6lwhm_b4056ea2-3fd8-4fb5-8ed3-575f7ee5cda0/serve-healthcheck-canary/0.log" Apr 21 11:10:33.190030 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:33.190001 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lttth_93143241-5e63-4495-914e-e2c61040261e/kube-rbac-proxy/0.log" Apr 21 11:10:33.215782 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:33.215739 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lttth_93143241-5e63-4495-914e-e2c61040261e/exporter/0.log" Apr 21 11:10:33.240319 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:33.240287 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-lttth_93143241-5e63-4495-914e-e2c61040261e/extractor/0.log" Apr 21 11:10:35.649391 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:35.649357 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-t6nkp_c90f570d-18e2-4d5f-9d01-40adb6de844e/s3-tls-init-serving/0.log" Apr 21 11:10:35.677831 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:35.677805 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-c2h6w_728a7d4c-4b8c-45db-a0ab-ff664f9148bf/seaweedfs/0.log" Apr 21 11:10:36.514797 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:36.514770 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-x9mhh/perf-node-gather-daemonset-4tl7k" Apr 21 11:10:39.694048 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:39.694007 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-7829w_ad55884a-4753-4348-a129-272c6dfc8db3/migrator/0.log" Apr 21 11:10:39.713699 ip-10-0-133-157 kubenswrapper[2577]: I0421 
11:10:39.713671 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-7829w_ad55884a-4753-4348-a129-272c6dfc8db3/graceful-termination/0.log" Apr 21 11:10:41.000544 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:41.000517 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g77pc_1e8275fc-da1c-442f-8dd4-cc1ad0f529fe/kube-multus-additional-cni-plugins/0.log" Apr 21 11:10:41.023264 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:41.023229 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g77pc_1e8275fc-da1c-442f-8dd4-cc1ad0f529fe/egress-router-binary-copy/0.log" Apr 21 11:10:41.054196 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:41.054174 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g77pc_1e8275fc-da1c-442f-8dd4-cc1ad0f529fe/cni-plugins/0.log" Apr 21 11:10:41.085258 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:41.085228 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g77pc_1e8275fc-da1c-442f-8dd4-cc1ad0f529fe/bond-cni-plugin/0.log" Apr 21 11:10:41.109582 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:41.109552 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g77pc_1e8275fc-da1c-442f-8dd4-cc1ad0f529fe/routeoverride-cni/0.log" Apr 21 11:10:41.134626 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:41.134595 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g77pc_1e8275fc-da1c-442f-8dd4-cc1ad0f529fe/whereabouts-cni-bincopy/0.log" Apr 21 11:10:41.156842 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:41.156806 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g77pc_1e8275fc-da1c-442f-8dd4-cc1ad0f529fe/whereabouts-cni/0.log" Apr 21 11:10:41.587725 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:41.587690 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xtxxg_a327162a-7ff0-4ea9-9be0-15ce746f80a2/kube-multus/0.log" Apr 21 11:10:41.661169 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:41.661137 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kckvj_e437b5da-7e75-4ed5-8d79-e418168b80fe/network-metrics-daemon/0.log" Apr 21 11:10:41.682869 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:41.682839 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-kckvj_e437b5da-7e75-4ed5-8d79-e418168b80fe/kube-rbac-proxy/0.log" Apr 21 11:10:42.477921 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:42.477887 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc44q_7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f/ovn-controller/0.log" Apr 21 11:10:42.533586 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:42.533550 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc44q_7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f/ovn-acl-logging/0.log" Apr 21 11:10:42.558164 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:42.558134 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc44q_7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f/kube-rbac-proxy-node/0.log" Apr 21 11:10:42.582788 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:42.582741 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc44q_7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f/kube-rbac-proxy-ovn-metrics/0.log" Apr 21 11:10:42.601757 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:42.601712 2577 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc44q_7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f/northd/0.log" Apr 21 11:10:42.625685 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:42.625644 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc44q_7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f/nbdb/0.log" Apr 21 11:10:42.650632 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:42.650606 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc44q_7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f/sbdb/0.log" Apr 21 11:10:42.830914 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:42.830834 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hc44q_7e21d9fd-fcef-42e2-8f1f-ef4aa8d9171f/ovnkube-controller/0.log" Apr 21 11:10:44.647272 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:44.647242 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-qxvf7_8723d4f5-441e-4586-b642-f008d599b082/network-check-target-container/0.log" Apr 21 11:10:45.548120 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:45.548083 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-gwnnd_c6bbb2a8-d1d8-46e7-a943-36618a64adb4/iptables-alerter/0.log" Apr 21 11:10:46.242090 ip-10-0-133-157 kubenswrapper[2577]: I0421 11:10:46.242061 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-h7tlg_451f825c-7185-464f-967b-97007b1437b8/tuned/0.log"