Apr 16 16:02:56.519469 ip-10-0-137-150 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:02:56.987877 ip-10-0-137-150 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:02:56.987877 ip-10-0-137-150 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:02:56.987877 ip-10-0-137-150 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:02:56.987877 ip-10-0-137-150 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:02:56.987877 ip-10-0-137-150 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:02:56.989621 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.989537 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:02:56.992609 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992593 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:02:56.992609 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992609 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992613 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992617 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992620 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992623 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992626 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992629 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992632 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992634 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992638 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992641 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992644 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992646 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992649 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992652 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992654 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992657 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992660 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992663 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992665 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:02:56.992672 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992669 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992671 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992674 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992677 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992680 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992683 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992687 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992691 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992695 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992698 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992701 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992704 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992707 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992710 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992713 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992715 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992718 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992721 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992724 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:02:56.993140 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992727 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992730 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992733 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992736 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992739 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992742 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992744 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992747 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992749 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992752 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992754 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992757 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992761 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992764 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992767 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992771 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992774 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992776 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992779 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992782 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:02:56.993651 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992785 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992788 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992791 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992794 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992797 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992801 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992803 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992806 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992808 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992811 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992814 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992817 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992820 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992822 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992825 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992828 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992830 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992833 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992836 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992838 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:02:56.994136 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992841 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992844 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992846 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992849 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992852 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.992854 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993256 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993264 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993267 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993269 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993274 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993278 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993282 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993285 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993288 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993290 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993294 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993296 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993299 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993302 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:02:56.994713 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993304 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993307 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993311 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993313 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993316 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993318 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993321 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993324 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993326 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993329 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993332 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993335 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993337 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993339 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993342 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993344 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993347 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993350 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993352 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993356 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:02:56.995193 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993359 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993362 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993365 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993368 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993370 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993373 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993375 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993379 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993383 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993386 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993389 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993392 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993394 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993397 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993400 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993403 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993405 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993408 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993410 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:02:56.995710 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993413 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993416 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993418 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993421 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993424 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993427 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993430 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993433 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993435 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993438 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993441 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993443 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993446 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993449 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993451 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993454 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993456 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993459 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993461 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:02:56.996171 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993464 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993466 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993469 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993471 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993474 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993477 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993479 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993482 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993484 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993487 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993494 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993497 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993500 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.993502 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994358 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994371 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994379 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994384 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994388 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994393 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994397 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:02:56.996655 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994402 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994405 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994408 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994412 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994415 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994418 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994422 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994425 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994428 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994431 2576 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994434 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994437 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994441 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994444 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994447 2576 flags.go:64] FLAG: --config-dir=""
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994450 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994453 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994457 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994460 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994463 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994467 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994470 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994474 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994477 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994481 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:02:56.997166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994483 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994488 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994491 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994494 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994497 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994500 2576 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994503 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994508 2576 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994511 2576 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994514 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994517 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994520 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994525 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994528 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994531 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994534 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994537 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994540 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994543 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994546 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994550 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994553 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994556 2576 flags.go:64] FLAG: --feature-gates=""
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994559 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994563 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 16:02:56.997801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994566 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994571 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 16:02:56.998431
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994574 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994577 2576 flags.go:64] FLAG: --help="false" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994580 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-137-150.ec2.internal" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994584 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994587 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994590 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994593 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994597 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994600 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994624 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994629 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994632 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994636 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994639 2576 
flags.go:64] FLAG: --kube-api-qps="50" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994642 2576 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994645 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994648 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994651 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994654 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994657 2576 flags.go:64] FLAG: --lock-file="" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994660 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994663 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:02:56.998431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994666 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994672 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994675 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994678 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994680 2576 flags.go:64] FLAG: --logging-format="text" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994684 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:02:56.999001 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:02:56.994687 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994690 2576 flags.go:64] FLAG: --manifest-url="" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994693 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994699 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994702 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994706 2576 flags.go:64] FLAG: --max-pods="110" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994709 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994712 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994716 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994719 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994722 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994726 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994729 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994737 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994740 2576 
flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994743 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994747 2576 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:02:56.999001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994750 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994756 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994759 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994762 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994765 2576 flags.go:64] FLAG: --port="10250" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994768 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994771 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-008ce19bcf0b14fe1" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994774 2576 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994777 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994780 2576 flags.go:64] FLAG: --register-node="true" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994783 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994786 2576 flags.go:64] FLAG: 
--register-with-taints="" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994790 2576 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994793 2576 flags.go:64] FLAG: --registry-qps="5" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994795 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994798 2576 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994802 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994804 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994809 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994812 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994815 2576 flags.go:64] FLAG: --runonce="false" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994818 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994821 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994824 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994828 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994831 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 16:02:56.999601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994834 
2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994837 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994840 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994844 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994847 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994850 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994852 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994856 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994859 2576 flags.go:64] FLAG: --system-cgroups="" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994862 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994867 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994870 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994872 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994876 2576 flags.go:64] FLAG: --tls-min-version="" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994879 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 16:02:57.000239 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994881 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994884 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994887 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994890 2576 flags.go:64] FLAG: --v="2" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994894 2576 flags.go:64] FLAG: --version="false" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994898 2576 flags.go:64] FLAG: --vmodule="" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994903 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.994906 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.994993 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.994998 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:02:57.000239 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995002 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995004 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995007 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995009 2576 feature_gate.go:328] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995012 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995014 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995017 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995020 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995022 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995025 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995028 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995031 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995033 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995036 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995039 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995041 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995044 2576 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995046 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995049 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995051 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:02:57.000875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995054 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995057 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995059 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995062 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995065 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995067 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995070 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995073 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995075 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:02:57.001433 ip-10-0-137-150 
kubenswrapper[2576]: W0416 16:02:56.995078 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995080 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995084 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995086 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995089 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995091 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995094 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995096 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995099 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995102 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995110 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:02:57.001433 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995113 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995116 2576 feature_gate.go:328] unrecognized feature gate: 
VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995119 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995122 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995124 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995127 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995131 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995134 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995138 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995141 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995144 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995147 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995150 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995153 2576 feature_gate.go:328] unrecognized feature gate: 
GCPCustomAPIEndpointsInstall Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995155 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995158 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995161 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995165 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995168 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:02:57.001913 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995171 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995173 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995176 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995179 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995183 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995185 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995188 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:02:57.002396 
ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995191 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995193 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995196 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995199 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995202 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995205 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995225 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995229 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995231 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995234 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995236 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995239 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995242 2576 feature_gate.go:328] 
unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:02:57.002396 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995244 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:02:57.002897 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995246 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:02:57.002897 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995249 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:02:57.002897 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995252 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:02:57.002897 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:56.995254 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:02:57.002897 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:56.995974 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:02:57.002897 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.002432 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 16:02:57.002897 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.002450 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 16:02:57.002897 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002506 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:02:57.002897 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002511 2576 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:02:57.002897 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002514 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:02:57.002897 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002518 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:02:57.002897 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002521 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:02:57.002897 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002524 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:02:57.002897 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002527 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:02:57.002897 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002530 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002533 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002536 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002539 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002542 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002545 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002547 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:02:57.003308 
ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002550 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002553 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002556 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002559 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002561 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002564 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002567 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002569 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002572 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002574 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002577 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002579 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002583 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 
16 16:02:57.003308 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002586 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002603 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002607 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002610 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002613 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002616 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002618 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002621 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002624 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002627 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002629 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002633 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002637 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002640 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002644 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002647 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002650 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002653 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002656 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:02:57.003799 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002658 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002661 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002664 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002666 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002670 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002672 2576 
feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002675 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002677 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002680 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002682 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002685 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002688 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002691 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002693 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002696 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002700 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
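The `feature_gate.go:328` warnings above repeat because the kubelet parses its feature-gate configuration more than once at startup, emitting one warning per unknown gate per pass. When triaging a log like this, a quick tally of the distinct gate names is more useful than reading every repeat. A minimal Python sketch (illustrative only — it assumes the journal text has been captured into a string, e.g. via `journalctl -u kubelet`; the `sample` lines here are abbreviated stand-ins for the real entries):

```python
import re
from collections import Counter

def unrecognized_gates(log_text: str) -> Counter:
    """Count 'unrecognized feature gate: <Name>' warnings per gate name."""
    return Counter(re.findall(r"unrecognized feature gate: (\S+)", log_text))

# Abbreviated sample lines standing in for real journal entries.
sample = (
    "W0416 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy\n"
    "W0416 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy\n"
    "W0416 feature_gate.go:328] unrecognized feature gate: GatewayAPI\n"
)
print(unrecognized_gates(sample).most_common())
```

Running this over the full journal collapses the repeated warning blocks into one count per gate, making it easy to see that the same set of gates is reported on each parsing pass.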
Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002704 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002706 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002709 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:02:57.004315 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002712 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002714 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002717 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002720 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002722 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002725 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002728 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002730 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002733 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002736 2576 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002738 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002741 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002743 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002746 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002749 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002751 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002754 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002757 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002760 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:02:57.004782 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002763 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:02:57.005248 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002766 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:02:57.005248 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.002771 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false 
EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:02:57.005248 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002873 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:02:57.005248 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002879 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:02:57.005248 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002882 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:02:57.005248 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002887 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
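The `feature_gate.go:384] feature gates: {map[...]}` lines above dump the effective gate settings in Go's `map` print format. A minimal Python sketch for turning that text into a dictionary (an assumption-laden helper, not part of any kubelet tooling; the `line` below is a shortened example, the real map carries the full gate list):

```python
import re

def parse_feature_gates(line: str) -> dict[str, bool]:
    """Parse a kubelet 'feature gates: {map[Key:bool ...]}' log line into a dict."""
    m = re.search(r"feature gates: \{map\[(.*?)\]\}", line)
    if not m:
        return {}
    gates = {}
    for pair in m.group(1).split():
        key, _, val = pair.partition(":")
        gates[key] = (val == "true")
    return gates

# Shortened example of the real log line.
line = "I0416 16:02:57.003114 2576 feature_gate.go:384] feature gates: {map[KMSv1:true NodeSwap:false]}"
print(parse_feature_gates(line))
```

Parsed this way, the three dumps in this log can be compared programmatically; in this capture they are identical across all passes.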
Apr 16 16:02:57.005248 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002890 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:02:57.005248 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002894 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:02:57.005248 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002897 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:02:57.005248 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002900 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:02:57.005248 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002903 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:02:57.005248 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002906 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:02:57.005248 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002909 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:02:57.005248 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002912 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:02:57.005248 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002915 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002918 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002921 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002924 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002927 2576 feature_gate.go:328] unrecognized feature 
gate: ManagedBootImagesAzure Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002929 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002932 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002934 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002937 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002939 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002942 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002945 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002947 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002950 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002952 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002955 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002958 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:02:57.005616 ip-10-0-137-150 
kubenswrapper[2576]: W0416 16:02:57.002961 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002964 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002966 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:02:57.005616 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002969 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002972 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002974 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002977 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002980 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002982 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002985 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002987 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002990 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002992 2576 
feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002995 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.002998 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003001 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003005 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003008 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003010 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003013 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003016 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003018 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:02:57.006094 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003021 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003023 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003026 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 
16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003028 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003031 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003034 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003036 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003039 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003042 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003045 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003048 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003051 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003054 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003056 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003058 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003061 2576 feature_gate.go:328] unrecognized feature 
gate: HighlyAvailableArbiter Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003063 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003066 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003069 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003071 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:02:57.006574 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003074 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:02:57.007059 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003076 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:02:57.007059 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003079 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:02:57.007059 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003081 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:02:57.007059 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003084 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:02:57.007059 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003086 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:02:57.007059 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003089 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:02:57.007059 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003091 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:02:57.007059 ip-10-0-137-150 
kubenswrapper[2576]: W0416 16:02:57.003094 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:02:57.007059 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003096 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:02:57.007059 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003099 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:02:57.007059 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003101 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:02:57.007059 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003104 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:02:57.007059 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003106 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:02:57.007059 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:57.003109 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:02:57.007059 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.003114 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 16:02:57.007539 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.004008 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 16:02:57.010033 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.010019 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the 
certificate dir" Apr 16 16:02:57.010984 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.010973 2576 server.go:1019] "Starting client certificate rotation" Apr 16 16:02:57.011105 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.011087 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 16:02:57.011154 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.011133 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 16:02:57.038922 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.038900 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 16:02:57.043349 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.043326 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 16:02:57.060352 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.060337 2576 log.go:25] "Validated CRI v1 runtime API" Apr 16 16:02:57.065412 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.065398 2576 log.go:25] "Validated CRI v1 image API" Apr 16 16:02:57.066694 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.066672 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 16:02:57.069779 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.069751 2576 fs.go:135] Filesystem UUIDs: map[7157087e-d8c9-4b0a-92d1-586482d317af:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 e005bbd5-3b70-41a6-95ef-5ad544de8feb:/dev/nvme0n1p3] Apr 16 16:02:57.069860 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.069778 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs 
blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 16:02:57.071028 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.071003 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:02:57.075677 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.075565 2576 manager.go:217] Machine: {Timestamp:2026-04-16 16:02:57.07360883 +0000 UTC m=+0.428598950 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100714 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec238715fcdb0d47aa6794f1b3f89ab5 SystemUUID:ec238715-fcdb-0d47-aa67-94f1b3f89ab5 BootID:145d2df8-49ff-4e8a-8a4c-49d5b76106e1 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:df:1d:c1:4d:79 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:df:1d:c1:4d:79 Speed:0 Mtu:9001} 
{Name:ovs-system MacAddress:62:51:86:e3:af:1e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 16:02:57.075677 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.075667 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
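Many of the informational entries in this log use klog's structured format, where details follow the quoted message as `key="value"` pairs (e.g. `"Kubelet version" kubeletVersion="v1.33.9"` earlier in this capture). A minimal sketch for pulling those pairs out of a line (a hypothetical helper for log inspection; it only handles the quoted-string form, not unquoted or nested values):

```python
import re

def klog_fields(line: str) -> dict[str, str]:
    """Extract key="value" pairs from a structured klog line."""
    return dict(re.findall(r'(\w+)="([^"]*)"', line))

line = 'I0416 16:02:57.002450 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"'
print(klog_fields(line))
```

The leading quoted message itself is not captured, since it is not preceded by `key=`; only the structured fields are returned.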
Apr 16 16:02:57.075810 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.075741 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:02:57.077292 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.077265 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:02:57.077466 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.077294 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-150.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 16:02:57.077514 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.077476 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 16:02:57.077514 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.077485 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 16:02:57.077514 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.077497 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:02:57.078239 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.078229 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:02:57.079699 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.079689 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:02:57.079801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.079792 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 16:02:57.082220 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.082199 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 16:02:57.082266 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.082228 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 16:02:57.082266 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.082241 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 16:02:57.082266 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.082251 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 16 16:02:57.082266 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.082259 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 16:02:57.083566 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.083549 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:02:57.083658 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.083577 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:02:57.086652 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.086636 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 16:02:57.088328 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.088315 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 16:02:57.089592 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.089577 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 16:02:57.089592 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.089593 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 16:02:57.089710 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.089600 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 16:02:57.089710 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.089609 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 16:02:57.089710 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.089619 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 16:02:57.089710 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.089627 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 16:02:57.089710 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.089633 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 16:02:57.089710 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.089639 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 16:02:57.089710 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.089646 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 16:02:57.089710 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.089652 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 16:02:57.089710 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.089672 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 16:02:57.089710 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.089685 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 16:02:57.090540 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.090530 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 16:02:57.090540 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.090540 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 16:02:57.091305 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.091287 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w8vwk"
Apr 16 16:02:57.093930 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.093917 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 16:02:57.093991 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.093952 2576 server.go:1295] "Started kubelet"
Apr 16 16:02:57.094065 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.094043 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 16:02:57.098616 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.094126 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 16:02:57.098701 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.098647 2576 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 16:02:57.099143 ip-10-0-137-150 systemd[1]: Started Kubernetes Kubelet.
Apr 16 16:02:57.099781 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.099755 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-150.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 16:02:57.099868 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.099784 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 16:02:57.099968 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.099828 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 16:02:57.099968 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.099925 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-150.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 16:02:57.100566 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.100539 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-w8vwk"
Apr 16 16:02:57.102165 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.102145 2576 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 16:02:57.106661 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.106645 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 16:02:57.106793 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.106752 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 16:02:57.107625 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.107588 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 16:02:57.107728 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.107684 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 16:02:57.107728 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.107701 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 16:02:57.107830 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.107799 2576 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 16:02:57.107830 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.107804 2576 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 16:02:57.107985 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.107958 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-150.ec2.internal\" not found"
Apr 16 16:02:57.108274 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.108247 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 16:02:57.108274 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.108270 2576 factory.go:55] Registering systemd factory
Apr 16 16:02:57.108489 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.108279 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 16 16:02:57.108622 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.108606 2576 factory.go:153] Registering CRI-O factory
Apr 16 16:02:57.108667 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.108626 2576 factory.go:223] Registration of the crio container factory successfully
Apr 16 16:02:57.108667 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.108643 2576 factory.go:103] Registering Raw factory
Apr 16 16:02:57.108667 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.108656 2576 manager.go:1196] Started watching for new ooms in manager
Apr 16 16:02:57.109250 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.109234 2576 manager.go:319] Starting recovery of all containers
Apr 16 16:02:57.109359 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.109337 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 16:02:57.111221 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.111187 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:02:57.113895 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.113876 2576 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-150.ec2.internal\" not found" node="ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.119854 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.119759 2576 manager.go:324] Recovery completed
Apr 16 16:02:57.124023 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.124013 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:02:57.126257 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.126242 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:02:57.126322 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.126269 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:02:57.126322 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.126279 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:02:57.126741 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.126729 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 16:02:57.126785 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.126741 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 16:02:57.126785 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.126756 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:02:57.128911 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.128900 2576 policy_none.go:49] "None policy: Start"
Apr 16 16:02:57.128953 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.128915 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 16:02:57.128953 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.128925 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 16:02:57.162857 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.162839 2576 manager.go:341] "Starting Device Plugin manager"
Apr 16 16:02:57.179579 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.162871 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 16:02:57.179579 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.162883 2576 server.go:85] "Starting device plugin registration server"
Apr 16 16:02:57.179579 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.163087 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 16:02:57.179579 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.163099 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 16:02:57.179579 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.163225 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 16:02:57.179579 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.163297 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 16:02:57.179579 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.163303 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 16:02:57.179579 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.163726 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 16:02:57.179579 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.163758 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-150.ec2.internal\" not found"
Apr 16 16:02:57.264013 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.263946 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:02:57.264971 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.264954 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:02:57.265054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.264982 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:02:57.265054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.264993 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:02:57.265054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.265017 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.272133 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.272105 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 16:02:57.273297 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.273280 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 16:02:57.273376 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.273305 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 16:02:57.273376 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.273327 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 16:02:57.273376 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.273336 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 16:02:57.273376 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.273374 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 16:02:57.273556 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.273413 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.273556 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.273439 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-150.ec2.internal\": node \"ip-10-0-137-150.ec2.internal\" not found"
Apr 16 16:02:57.276927 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.276911 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:02:57.316775 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.316753 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-150.ec2.internal\" not found"
Apr 16 16:02:57.373839 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.373819 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-150.ec2.internal"]
Apr 16 16:02:57.373892 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.373880 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:02:57.374704 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.374683 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:02:57.374784 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.374713 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:02:57.374784 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.374723 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:02:57.375919 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.375908 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:02:57.376047 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.376033 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.376092 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.376062 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:02:57.377617 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.377604 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:02:57.377680 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.377609 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:02:57.377680 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.377654 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:02:57.377680 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.377664 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:02:57.377680 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.377631 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:02:57.377818 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.377691 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:02:57.379438 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.379423 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.379504 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.379452 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:02:57.380124 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.380110 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:02:57.380185 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.380140 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:02:57.380185 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.380153 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:02:57.398732 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.398714 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-150.ec2.internal\" not found" node="ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.402691 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.402677 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-150.ec2.internal\" not found" node="ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.409912 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.409896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/86a89fa024a7f053b23d64348497745e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal\" (UID: \"86a89fa024a7f053b23d64348497745e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.409958 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.409919 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6ba071faccce2e60097e197ecb90d16a-config\") pod \"kube-apiserver-proxy-ip-10-0-137-150.ec2.internal\" (UID: \"6ba071faccce2e60097e197ecb90d16a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.409958 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.409935 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/86a89fa024a7f053b23d64348497745e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal\" (UID: \"86a89fa024a7f053b23d64348497745e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.417577 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.417565 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-150.ec2.internal\" not found"
Apr 16 16:02:57.510204 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.510178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/86a89fa024a7f053b23d64348497745e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal\" (UID: \"86a89fa024a7f053b23d64348497745e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.510321 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.510222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/86a89fa024a7f053b23d64348497745e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal\" (UID: \"86a89fa024a7f053b23d64348497745e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.510321 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.510239 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6ba071faccce2e60097e197ecb90d16a-config\") pod \"kube-apiserver-proxy-ip-10-0-137-150.ec2.internal\" (UID: \"6ba071faccce2e60097e197ecb90d16a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.510321 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.510279 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/86a89fa024a7f053b23d64348497745e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal\" (UID: \"86a89fa024a7f053b23d64348497745e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.510321 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.510283 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/6ba071faccce2e60097e197ecb90d16a-config\") pod \"kube-apiserver-proxy-ip-10-0-137-150.ec2.internal\" (UID: \"6ba071faccce2e60097e197ecb90d16a\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.510442 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.510330 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/86a89fa024a7f053b23d64348497745e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal\" (UID: \"86a89fa024a7f053b23d64348497745e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.518315 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.518269 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-150.ec2.internal\" not found"
Apr 16 16:02:57.618943 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.618922 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-150.ec2.internal\" not found"
Apr 16 16:02:57.702092 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.702060 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.705810 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:57.705791 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-150.ec2.internal"
Apr 16 16:02:57.719751 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.719732 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-150.ec2.internal\" not found"
Apr 16 16:02:57.820238 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.820169 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-150.ec2.internal\" not found"
Apr 16 16:02:57.920709 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:57.920687 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-150.ec2.internal\" not found"
Apr 16 16:02:58.011161 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.011141 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 16:02:58.011618 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.011290 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:02:58.011618 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.011311 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:02:58.021416 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:58.021397 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-150.ec2.internal\" not found"
Apr 16 16:02:58.102567 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.102508 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 15:57:57 +0000 UTC" deadline="2027-12-12 17:41:59.741108534 +0000 UTC"
Apr 16 16:02:58.102567 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.102532 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14521h39m1.638579479s"
Apr 16 16:02:58.107650 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.107627 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 16:02:58.122162 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:58.122143 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-150.ec2.internal\" not found"
Apr 16 16:02:58.129348 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.129329 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:02:58.167622 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.167591 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qgbbl"
Apr 16 16:02:58.178071 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.178050 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qgbbl"
Apr 16 16:02:58.222366 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:58.222340 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ba071faccce2e60097e197ecb90d16a.slice/crio-71525fd93611fdb761dd25e2ac714d820cc37f6a82ea373d841ccf649ff66883 WatchSource:0}: Error finding container 71525fd93611fdb761dd25e2ac714d820cc37f6a82ea373d841ccf649ff66883: Status 404 returned error can't find the container with id 71525fd93611fdb761dd25e2ac714d820cc37f6a82ea373d841ccf649ff66883
Apr 16 16:02:58.222366 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:58.222350 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-150.ec2.internal\" not found"
Apr 16 16:02:58.222757 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:58.222738 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86a89fa024a7f053b23d64348497745e.slice/crio-0aa611bbca4f5fcdf135083b1100aab7852b402bce7d817244925d036e9565dd WatchSource:0}: Error finding container 0aa611bbca4f5fcdf135083b1100aab7852b402bce7d817244925d036e9565dd: Status 404 returned error can't find the container with id 0aa611bbca4f5fcdf135083b1100aab7852b402bce7d817244925d036e9565dd
Apr 16 16:02:58.226621 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.226607 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:02:58.276630 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.276581 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-150.ec2.internal" event={"ID":"6ba071faccce2e60097e197ecb90d16a","Type":"ContainerStarted","Data":"71525fd93611fdb761dd25e2ac714d820cc37f6a82ea373d841ccf649ff66883"}
Apr 16 16:02:58.277408 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.277387 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal" event={"ID":"86a89fa024a7f053b23d64348497745e","Type":"ContainerStarted","Data":"0aa611bbca4f5fcdf135083b1100aab7852b402bce7d817244925d036e9565dd"}
Apr 16 16:02:58.322923 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:58.322902 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-150.ec2.internal\" not found"
Apr 16 16:02:58.423404 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:58.423378 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-150.ec2.internal\" not found"
Apr 16 16:02:58.493478 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.493451 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:02:58.507336 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.507303 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal"
Apr 16 16:02:58.545137 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.545118 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:02:58.547016 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.547002 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-150.ec2.internal"
Apr 16 16:02:58.564702 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.564681 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:02:58.564808 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.564800 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 16:02:58.978037 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:58.978008 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:02:59.083050 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.083024 2576 apiserver.go:52] "Watching apiserver"
Apr 16 16:02:59.090385 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.090355 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 16:02:59.091956 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.091920 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal","openshift-multus/multus-additional-cni-plugins-rvbp5","openshift-multus/network-metrics-daemon-gvndv","openshift-network-diagnostics/network-check-target-xjfdj","kube-system/konnectivity-agent-8rbfc","kube-system/kube-apiserver-proxy-ip-10-0-137-150.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2","openshift-multus/multus-mjlcb","openshift-network-operator/iptables-alerter-rvgfg","openshift-ovn-kubernetes/ovnkube-node-5fl2m","openshift-cluster-node-tuning-operator/tuned-tw4j2","openshift-dns/node-resolver-6bkvd","openshift-image-registry/node-ca-cgg4d"]
Apr 16 16:02:59.093640 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.093619 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2"
Apr 16 16:02:59.095112 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.095091 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.096330 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.096308 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:02:59.096444 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:59.096377 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703" Apr 16 16:02:59.096734 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.096716 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 16:02:59.097080 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.097061 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-lpsvq\"" Apr 16 16:02:59.097164 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.097145 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 16:02:59.097838 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.097819 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 16:02:59.098596 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.098580 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:02:59.098683 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:59.098650 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5" Apr 16 16:02:59.099354 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.099334 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 16:02:59.099449 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.099383 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 16:02:59.099449 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.099416 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 16:02:59.099554 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.099457 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 16:02:59.099554 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.099538 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 16:02:59.099554 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.099546 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gqfjh\"" Apr 16 16:02:59.099991 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.099961 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-8rbfc" Apr 16 16:02:59.101611 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.101328 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.102916 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.102611 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rvgfg" Apr 16 16:02:59.102916 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.102823 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 16:02:59.103092 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.103076 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 16:02:59.103150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.103113 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-cm6fv\"" Apr 16 16:02:59.103816 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.103798 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 16:02:59.103914 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.103824 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-hfdxs\"" Apr 16 16:02:59.105009 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.104993 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.106342 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.106325 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.108086 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.108060 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6bkvd" Apr 16 16:02:59.109081 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.109060 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 16:02:59.109167 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.109087 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 16:02:59.109244 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.109179 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 16:02:59.109244 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.109061 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-p2x9m\"" Apr 16 16:02:59.109244 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.109064 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:02:59.109244 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.109198 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:02:59.109437 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.109351 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 16:02:59.109603 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.109523 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-cgg4d" Apr 16 16:02:59.109603 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.109594 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 16:02:59.110987 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.110936 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 16:02:59.111227 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.111104 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 16:02:59.113314 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.111999 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 16:02:59.113314 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.112133 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 16:02:59.113314 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.112273 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kd9ss\"" Apr 16 16:02:59.113314 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.112552 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 16:02:59.113314 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.113041 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 16:02:59.113314 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.113140 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 16:02:59.113314 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.113312 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-279dv\"" Apr 16 16:02:59.113663 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.113383 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fkwk5\"" Apr 16 16:02:59.113851 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.113826 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-94srw\"" Apr 16 16:02:59.117636 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.117612 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e8c9cbbd-500b-410c-bc31-680efc2b8a0c-iptables-alerter-script\") pod \"iptables-alerter-rvgfg\" (UID: \"e8c9cbbd-500b-410c-bc31-680efc2b8a0c\") " pod="openshift-network-operator/iptables-alerter-rvgfg" Apr 16 16:02:59.117716 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.117651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-sysctl-conf\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.117716 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.117678 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-var-lib-kubelet\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " 
pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.117716 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.117703 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gz7z\" (UniqueName: \"kubernetes.io/projected/345463d3-76fd-4233-8808-6df63a64c4b5-kube-api-access-6gz7z\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.117873 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.117780 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c969e45c-de0a-46a1-b293-93c2eb9bcd6f-konnectivity-ca\") pod \"konnectivity-agent-8rbfc\" (UID: \"c969e45c-de0a-46a1-b293-93c2eb9bcd6f\") " pod="kube-system/konnectivity-agent-8rbfc" Apr 16 16:02:59.117873 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.117819 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.117962 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.117864 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-var-lib-cni-bin\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.117962 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.117898 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/e8c9cbbd-500b-410c-bc31-680efc2b8a0c-host-slash\") pod \"iptables-alerter-rvgfg\" (UID: \"e8c9cbbd-500b-410c-bc31-680efc2b8a0c\") " pod="openshift-network-operator/iptables-alerter-rvgfg" Apr 16 16:02:59.117962 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.117925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0578a47-b539-4acd-9e68-0468bd267183-ovnkube-config\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.117962 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.117949 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-sysctl-d\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.118142 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.117973 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-sys\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.118142 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.117996 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-tuned\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.118142 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118029 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-etc-selinux\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.118142 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118066 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a32280f-6aec-4142-bb8f-1547ac4378ab-cni-binary-copy\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.118142 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118116 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-run-systemd\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.118349 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118152 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-var-lib-openvswitch\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.118349 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfm2n\" (UniqueName: \"kubernetes.io/projected/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-kube-api-access-xfm2n\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " 
pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.118349 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-var-lib-cni-multus\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.118349 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118247 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs\") pod \"network-metrics-daemon-gvndv\" (UID: \"916e5e50-1aef-4277-971a-7f2e8ffd2703\") " pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:02:59.118349 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-cnibin\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.118349 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118287 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-lib-modules\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.118349 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118310 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/345463d3-76fd-4233-8808-6df63a64c4b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.118349 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c969e45c-de0a-46a1-b293-93c2eb9bcd6f-agent-certs\") pod \"konnectivity-agent-8rbfc\" (UID: \"c969e45c-de0a-46a1-b293-93c2eb9bcd6f\") " pod="kube-system/konnectivity-agent-8rbfc" Apr 16 16:02:59.118644 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-socket-dir\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.118644 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118404 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-run-multus-certs\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.118644 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-cni-netd\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.118644 ip-10-0-137-150 kubenswrapper[2576]: I0416 
16:02:59.118444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0578a47-b539-4acd-9e68-0468bd267183-ovn-node-metrics-cert\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m"
Apr 16 16:02:59.118644 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118479 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-kubernetes\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2"
Apr 16 16:02:59.118644 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118514 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-run\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2"
Apr 16 16:02:59.118644 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-sys-fs\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2"
Apr 16 16:02:59.118644 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-node-log\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m"
Apr 16 16:02:59.118644 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118612 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-tmp\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2"
Apr 16 16:02:59.118644 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118641 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/345463d3-76fd-4233-8808-6df63a64c4b5-cnibin\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5"
Apr 16 16:02:59.119054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118665 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-system-cni-dir\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb"
Apr 16 16:02:59.119054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-var-lib-kubelet\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb"
Apr 16 16:02:59.119054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118710 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0578a47-b539-4acd-9e68-0468bd267183-env-overrides\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m"
Apr 16 16:02:59.119054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118751 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/345463d3-76fd-4233-8808-6df63a64c4b5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5"
Apr 16 16:02:59.119054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118777 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5e28615-1240-4149-a23f-752b612f8a06-hosts-file\") pod \"node-resolver-6bkvd\" (UID: \"e5e28615-1240-4149-a23f-752b612f8a06\") " pod="openshift-dns/node-resolver-6bkvd"
Apr 16 16:02:59.119054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-hostroot\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb"
Apr 16 16:02:59.119054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4a32280f-6aec-4142-bb8f-1547ac4378ab-multus-daemon-config\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb"
Apr 16 16:02:59.119054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118860 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/345463d3-76fd-4233-8808-6df63a64c4b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5"
Apr 16 16:02:59.119054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-etc-openvswitch\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m"
Apr 16 16:02:59.119054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118904 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-run-ovn\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m"
Apr 16 16:02:59.119054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118925 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-sysconfig\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2"
Apr 16 16:02:59.119054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-run-netns\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb"
Apr 16 16:02:59.119054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118972 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-etc-kubernetes\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb"
Apr 16 16:02:59.119054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.118992 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-log-socket\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m"
Apr 16 16:02:59.119054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119038 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/345463d3-76fd-4233-8808-6df63a64c4b5-os-release\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5"
Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119061 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/345463d3-76fd-4233-8808-6df63a64c4b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5"
Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-registration-dir\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2"
Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119111 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktggg\" (UniqueName: \"kubernetes.io/projected/9eca3407-9be3-44a0-8eb5-274e3e43107d-kube-api-access-ktggg\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2"
Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwmpg\" (UniqueName: \"kubernetes.io/projected/916e5e50-1aef-4277-971a-7f2e8ffd2703-kube-api-access-pwmpg\") pod \"network-metrics-daemon-gvndv\" (UID: \"916e5e50-1aef-4277-971a-7f2e8ffd2703\") " pod="openshift-multus/network-metrics-daemon-gvndv"
Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-multus-cni-dir\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb"
Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-run-k8s-cni-cncf-io\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb"
Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119200 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-multus-conf-dir\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119240 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwvgt\" (UniqueName: \"kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt\") pod \"network-check-target-xjfdj\" (UID: \"31ee7aaa-d858-49a4-becd-246ec9f1a8c5\") " pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119267 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r7q2\" (UniqueName: \"kubernetes.io/projected/4a32280f-6aec-4142-bb8f-1547ac4378ab-kube-api-access-4r7q2\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-systemd-units\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d0578a47-b539-4acd-9e68-0468bd267183-ovnkube-script-lib\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119390 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-modprobe-d\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119412 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5e28615-1240-4149-a23f-752b612f8a06-tmp-dir\") pod \"node-resolver-6bkvd\" (UID: \"e5e28615-1240-4149-a23f-752b612f8a06\") " pod="openshift-dns/node-resolver-6bkvd" Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-kubelet\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119468 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-slash\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.119635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119491 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/345463d3-76fd-4233-8808-6df63a64c4b5-system-cni-dir\") 
pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.120272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119543 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bd7ws\" (UniqueName: \"kubernetes.io/projected/e5e28615-1240-4149-a23f-752b612f8a06-kube-api-access-bd7ws\") pod \"node-resolver-6bkvd\" (UID: \"e5e28615-1240-4149-a23f-752b612f8a06\") " pod="openshift-dns/node-resolver-6bkvd" Apr 16 16:02:59.120272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-device-dir\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.120272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119609 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-os-release\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.120272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119632 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5bzw\" (UniqueName: \"kubernetes.io/projected/e8c9cbbd-500b-410c-bc31-680efc2b8a0c-kube-api-access-f5bzw\") pod \"iptables-alerter-rvgfg\" (UID: \"e8c9cbbd-500b-410c-bc31-680efc2b8a0c\") " pod="openshift-network-operator/iptables-alerter-rvgfg" Apr 16 16:02:59.120272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119655 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-run-netns\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.120272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119675 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-run-openvswitch\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.120272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-run-ovn-kubernetes\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.120272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119709 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.120272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119729 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-systemd\") pod \"tuned-tw4j2\" (UID: 
\"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.120272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119750 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-multus-socket-dir-parent\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.120272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-cni-bin\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.120272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119793 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8fch\" (UniqueName: \"kubernetes.io/projected/d0578a47-b539-4acd-9e68-0468bd267183-kube-api-access-f8fch\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.120272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.119813 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-host\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.123157 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.123139 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 16:02:59.123506 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.123488 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 16:02:59.179445 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.179413 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:57:58 +0000 UTC" deadline="2027-11-11 10:53:55.58943222 +0000 UTC" Apr 16 16:02:59.179445 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.179441 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13770h50m56.409993328s" Apr 16 16:02:59.208955 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.208925 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 16:02:59.220456 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-cnibin\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.220592 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220469 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-lib-modules\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.220592 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/345463d3-76fd-4233-8808-6df63a64c4b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.220592 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220511 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-cnibin\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.220592 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220533 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c969e45c-de0a-46a1-b293-93c2eb9bcd6f-agent-certs\") pod \"konnectivity-agent-8rbfc\" (UID: \"c969e45c-de0a-46a1-b293-93c2eb9bcd6f\") " pod="kube-system/konnectivity-agent-8rbfc" Apr 16 16:02:59.220592 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-socket-dir\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.220592 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-run-multus-certs\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.220880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-cni-netd\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.220880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0578a47-b539-4acd-9e68-0468bd267183-ovn-node-metrics-cert\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.220880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-kubernetes\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.220880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220673 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-lib-modules\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.220880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-run\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.220880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220732 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-run\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.220880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220731 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-sys-fs\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.220880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220757 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-socket-dir\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.220880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-node-log\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.220880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-tmp\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.220880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220803 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-sys-fs\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.220880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220820 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/345463d3-76fd-4233-8808-6df63a64c4b5-cnibin\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.220880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220847 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-system-cni-dir\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.220880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220857 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-run-multus-certs\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.220880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220875 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-var-lib-kubelet\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220915 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-var-lib-kubelet\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220908 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d0578a47-b539-4acd-9e68-0468bd267183-env-overrides\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220912 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.220965 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-cni-netd\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221020 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/345463d3-76fd-4233-8808-6df63a64c4b5-cnibin\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221082 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-system-cni-dir\") pod \"multus-mjlcb\" (UID: 
\"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221132 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-kubernetes\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221177 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-node-log\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221185 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/345463d3-76fd-4233-8808-6df63a64c4b5-cni-binary-copy\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/345463d3-76fd-4233-8808-6df63a64c4b5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221291 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/d0578a47-b539-4acd-9e68-0468bd267183-env-overrides\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221311 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5e28615-1240-4149-a23f-752b612f8a06-hosts-file\") pod \"node-resolver-6bkvd\" (UID: \"e5e28615-1240-4149-a23f-752b612f8a06\") " pod="openshift-dns/node-resolver-6bkvd" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221369 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e5e28615-1240-4149-a23f-752b612f8a06-hosts-file\") pod \"node-resolver-6bkvd\" (UID: \"e5e28615-1240-4149-a23f-752b612f8a06\") " pod="openshift-dns/node-resolver-6bkvd" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221400 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-hostroot\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221427 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4a32280f-6aec-4142-bb8f-1547ac4378ab-multus-daemon-config\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221474 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-hostroot\") pod 
\"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.221512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/345463d3-76fd-4233-8808-6df63a64c4b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221578 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcfxz\" (UniqueName: \"kubernetes.io/projected/4b05a2af-d8f2-42c1-a086-851d57791b5f-kube-api-access-fcfxz\") pod \"node-ca-cgg4d\" (UID: \"4b05a2af-d8f2-42c1-a086-851d57791b5f\") " pod="openshift-image-registry/node-ca-cgg4d" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221607 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-etc-openvswitch\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-run-ovn\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221660 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-sysconfig\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221692 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-run-netns\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221717 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-etc-kubernetes\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-log-socket\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/345463d3-76fd-4233-8808-6df63a64c4b5-os-release\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221781 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/345463d3-76fd-4233-8808-6df63a64c4b5-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221773 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-run-ovn\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221805 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/345463d3-76fd-4233-8808-6df63a64c4b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221894 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/345463d3-76fd-4233-8808-6df63a64c4b5-os-release\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221930 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-log-socket\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221954 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-etc-openvswitch\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/345463d3-76fd-4233-8808-6df63a64c4b5-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.221987 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-etc-kubernetes\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.222283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222050 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-sysconfig\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222080 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-run-netns\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-registration-dir\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-registration-dir\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222225 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktggg\" (UniqueName: \"kubernetes.io/projected/9eca3407-9be3-44a0-8eb5-274e3e43107d-kube-api-access-ktggg\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwmpg\" (UniqueName: \"kubernetes.io/projected/916e5e50-1aef-4277-971a-7f2e8ffd2703-kube-api-access-pwmpg\") pod \"network-metrics-daemon-gvndv\" (UID: \"916e5e50-1aef-4277-971a-7f2e8ffd2703\") " pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-multus-cni-dir\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.223034 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:02:59.222340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-run-k8s-cni-cncf-io\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222365 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-multus-conf-dir\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwvgt\" (UniqueName: \"kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt\") pod \"network-check-target-xjfdj\" (UID: \"31ee7aaa-d858-49a4-becd-246ec9f1a8c5\") " pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222445 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-run-k8s-cni-cncf-io\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222522 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/345463d3-76fd-4233-8808-6df63a64c4b5-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " 
pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222535 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4b05a2af-d8f2-42c1-a086-851d57791b5f-serviceca\") pod \"node-ca-cgg4d\" (UID: \"4b05a2af-d8f2-42c1-a086-851d57791b5f\") " pod="openshift-image-registry/node-ca-cgg4d" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222537 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-multus-cni-dir\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222595 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-multus-conf-dir\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4r7q2\" (UniqueName: \"kubernetes.io/projected/4a32280f-6aec-4142-bb8f-1547ac4378ab-kube-api-access-4r7q2\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222770 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-systemd-units\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.223034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-systemd-units\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.222867 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d0578a47-b539-4acd-9e68-0468bd267183-ovnkube-script-lib\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223047 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-modprobe-d\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223395 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4a32280f-6aec-4142-bb8f-1547ac4378ab-multus-daemon-config\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223399 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-modprobe-d\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " 
pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5e28615-1240-4149-a23f-752b612f8a06-tmp-dir\") pod \"node-resolver-6bkvd\" (UID: \"e5e28615-1240-4149-a23f-752b612f8a06\") " pod="openshift-dns/node-resolver-6bkvd" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223499 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-kubelet\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223522 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-slash\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223548 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/345463d3-76fd-4233-8808-6df63a64c4b5-system-cni-dir\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bd7ws\" (UniqueName: \"kubernetes.io/projected/e5e28615-1240-4149-a23f-752b612f8a06-kube-api-access-bd7ws\") pod \"node-resolver-6bkvd\" (UID: 
\"e5e28615-1240-4149-a23f-752b612f8a06\") " pod="openshift-dns/node-resolver-6bkvd" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223627 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b05a2af-d8f2-42c1-a086-851d57791b5f-host\") pod \"node-ca-cgg4d\" (UID: \"4b05a2af-d8f2-42c1-a086-851d57791b5f\") " pod="openshift-image-registry/node-ca-cgg4d" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-device-dir\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223684 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-os-release\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223710 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5bzw\" (UniqueName: \"kubernetes.io/projected/e8c9cbbd-500b-410c-bc31-680efc2b8a0c-kube-api-access-f5bzw\") pod \"iptables-alerter-rvgfg\" (UID: \"e8c9cbbd-500b-410c-bc31-680efc2b8a0c\") " pod="openshift-network-operator/iptables-alerter-rvgfg" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223737 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-run-netns\") pod 
\"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223771 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-run-openvswitch\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.223825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223809 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5e28615-1240-4149-a23f-752b612f8a06-tmp-dir\") pod \"node-resolver-6bkvd\" (UID: \"e5e28615-1240-4149-a23f-752b612f8a06\") " pod="openshift-dns/node-resolver-6bkvd" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223855 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-kubelet\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223886 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-run-ovn-kubernetes\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223902 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-slash\") pod \"ovnkube-node-5fl2m\" (UID: 
\"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223813 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-run-ovn-kubernetes\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/345463d3-76fd-4233-8808-6df63a64c4b5-system-cni-dir\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223944 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223973 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-systemd\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.223998 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-multus-socket-dir-parent\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-cni-bin\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224050 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8fch\" (UniqueName: \"kubernetes.io/projected/d0578a47-b539-4acd-9e68-0468bd267183-kube-api-access-f8fch\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-host\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e8c9cbbd-500b-410c-bc31-680efc2b8a0c-iptables-alerter-script\") pod \"iptables-alerter-rvgfg\" (UID: \"e8c9cbbd-500b-410c-bc31-680efc2b8a0c\") " pod="openshift-network-operator/iptables-alerter-rvgfg" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-os-release\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-sysctl-conf\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-run-netns\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224197 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-var-lib-kubelet\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.224638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-run-openvswitch\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gz7z\" 
(UniqueName: \"kubernetes.io/projected/345463d3-76fd-4233-8808-6df63a64c4b5-kube-api-access-6gz7z\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c969e45c-de0a-46a1-b293-93c2eb9bcd6f-konnectivity-ca\") pod \"konnectivity-agent-8rbfc\" (UID: \"c969e45c-de0a-46a1-b293-93c2eb9bcd6f\") " pod="kube-system/konnectivity-agent-8rbfc" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224281 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-host\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224314 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224341 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-var-lib-cni-bin\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224366 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e8c9cbbd-500b-410c-bc31-680efc2b8a0c-host-slash\") pod \"iptables-alerter-rvgfg\" (UID: \"e8c9cbbd-500b-410c-bc31-680efc2b8a0c\") " pod="openshift-network-operator/iptables-alerter-rvgfg" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-systemd\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224397 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0578a47-b539-4acd-9e68-0468bd267183-ovnkube-config\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224423 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-multus-socket-dir-parent\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224454 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-sysctl-d\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224456 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-host-cni-bin\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-sys\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224562 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-sys\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224572 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-var-lib-cni-bin\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/e8c9cbbd-500b-410c-bc31-680efc2b8a0c-host-slash\") pod \"iptables-alerter-rvgfg\" (UID: \"e8c9cbbd-500b-410c-bc31-680efc2b8a0c\") " pod="openshift-network-operator/iptables-alerter-rvgfg" Apr 16 16:02:59.225417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224739 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-sysctl-d\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224786 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-tuned\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-var-lib-kubelet\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-etc-selinux\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a32280f-6aec-4142-bb8f-1547ac4378ab-cni-binary-copy\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-run-systemd\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-var-lib-openvswitch\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224940 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfm2n\" (UniqueName: \"kubernetes.io/projected/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-kube-api-access-xfm2n\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224965 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-var-lib-cni-multus\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.224990 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs\") pod \"network-metrics-daemon-gvndv\" (UID: \"916e5e50-1aef-4277-971a-7f2e8ffd2703\") " pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.225044 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-run-systemd\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.225057 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-sysctl-conf\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:59.225110 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.225107 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/c969e45c-de0a-46a1-b293-93c2eb9bcd6f-konnectivity-ca\") pod 
\"konnectivity-agent-8rbfc\" (UID: \"c969e45c-de0a-46a1-b293-93c2eb9bcd6f\") " pod="kube-system/konnectivity-agent-8rbfc" Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.225151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-etc-selinux\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:59.225179 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs podName:916e5e50-1aef-4277-971a-7f2e8ffd2703 nodeName:}" failed. No retries permitted until 2026-04-16 16:02:59.725157006 +0000 UTC m=+3.080147132 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs") pod "network-metrics-daemon-gvndv" (UID: "916e5e50-1aef-4277-971a-7f2e8ffd2703") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:59.226150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.225185 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e8c9cbbd-500b-410c-bc31-680efc2b8a0c-iptables-alerter-script\") pod \"iptables-alerter-rvgfg\" (UID: \"e8c9cbbd-500b-410c-bc31-680efc2b8a0c\") " pod="openshift-network-operator/iptables-alerter-rvgfg" Apr 16 16:02:59.226731 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.225203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4a32280f-6aec-4142-bb8f-1547ac4378ab-host-var-lib-cni-multus\") pod \"multus-mjlcb\" (UID: 
\"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.226731 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.225267 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d0578a47-b539-4acd-9e68-0468bd267183-var-lib-openvswitch\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.226731 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.225679 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d0578a47-b539-4acd-9e68-0468bd267183-ovn-node-metrics-cert\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.226731 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.225728 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9eca3407-9be3-44a0-8eb5-274e3e43107d-device-dir\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.226731 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.225785 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4a32280f-6aec-4142-bb8f-1547ac4378ab-cni-binary-copy\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.226731 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.225812 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/c969e45c-de0a-46a1-b293-93c2eb9bcd6f-agent-certs\") pod \"konnectivity-agent-8rbfc\" (UID: 
\"c969e45c-de0a-46a1-b293-93c2eb9bcd6f\") " pod="kube-system/konnectivity-agent-8rbfc" Apr 16 16:02:59.226731 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.226117 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-tmp\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.226731 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.226194 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d0578a47-b539-4acd-9e68-0468bd267183-ovnkube-config\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.226731 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.226258 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d0578a47-b539-4acd-9e68-0468bd267183-ovnkube-script-lib\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.227412 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.227394 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-etc-tuned\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.235440 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:59.235386 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:02:59.235440 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:59.235413 2576 projected.go:289] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:02:59.235586 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:59.235442 2576 projected.go:194] Error preparing data for projected volume kube-api-access-zwvgt for pod openshift-network-diagnostics/network-check-target-xjfdj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:59.235586 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:59.235516 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt podName:31ee7aaa-d858-49a4-becd-246ec9f1a8c5 nodeName:}" failed. No retries permitted until 2026-04-16 16:02:59.735487479 +0000 UTC m=+3.090477592 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zwvgt" (UniqueName: "kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt") pod "network-check-target-xjfdj" (UID: "31ee7aaa-d858-49a4-becd-246ec9f1a8c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:59.237855 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.237833 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktggg\" (UniqueName: \"kubernetes.io/projected/9eca3407-9be3-44a0-8eb5-274e3e43107d-kube-api-access-ktggg\") pod \"aws-ebs-csi-driver-node-72kw2\" (UID: \"9eca3407-9be3-44a0-8eb5-274e3e43107d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.237936 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.237881 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwmpg\" (UniqueName: 
\"kubernetes.io/projected/916e5e50-1aef-4277-971a-7f2e8ffd2703-kube-api-access-pwmpg\") pod \"network-metrics-daemon-gvndv\" (UID: \"916e5e50-1aef-4277-971a-7f2e8ffd2703\") " pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:02:59.238992 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.238975 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r7q2\" (UniqueName: \"kubernetes.io/projected/4a32280f-6aec-4142-bb8f-1547ac4378ab-kube-api-access-4r7q2\") pod \"multus-mjlcb\" (UID: \"4a32280f-6aec-4142-bb8f-1547ac4378ab\") " pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.246335 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.246313 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8fch\" (UniqueName: \"kubernetes.io/projected/d0578a47-b539-4acd-9e68-0468bd267183-kube-api-access-f8fch\") pod \"ovnkube-node-5fl2m\" (UID: \"d0578a47-b539-4acd-9e68-0468bd267183\") " pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.250871 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.250845 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5bzw\" (UniqueName: \"kubernetes.io/projected/e8c9cbbd-500b-410c-bc31-680efc2b8a0c-kube-api-access-f5bzw\") pod \"iptables-alerter-rvgfg\" (UID: \"e8c9cbbd-500b-410c-bc31-680efc2b8a0c\") " pod="openshift-network-operator/iptables-alerter-rvgfg" Apr 16 16:02:59.251766 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.251748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gz7z\" (UniqueName: \"kubernetes.io/projected/345463d3-76fd-4233-8808-6df63a64c4b5-kube-api-access-6gz7z\") pod \"multus-additional-cni-plugins-rvbp5\" (UID: \"345463d3-76fd-4233-8808-6df63a64c4b5\") " pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.253290 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.253273 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bd7ws\" (UniqueName: \"kubernetes.io/projected/e5e28615-1240-4149-a23f-752b612f8a06-kube-api-access-bd7ws\") pod \"node-resolver-6bkvd\" (UID: \"e5e28615-1240-4149-a23f-752b612f8a06\") " pod="openshift-dns/node-resolver-6bkvd" Apr 16 16:02:59.253668 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.253651 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfm2n\" (UniqueName: \"kubernetes.io/projected/f2c0205d-61f8-4c10-bd7f-f8fcf4336fba-kube-api-access-xfm2n\") pod \"tuned-tw4j2\" (UID: \"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba\") " pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.325394 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.325362 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcfxz\" (UniqueName: \"kubernetes.io/projected/4b05a2af-d8f2-42c1-a086-851d57791b5f-kube-api-access-fcfxz\") pod \"node-ca-cgg4d\" (UID: \"4b05a2af-d8f2-42c1-a086-851d57791b5f\") " pod="openshift-image-registry/node-ca-cgg4d" Apr 16 16:02:59.325566 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.325420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4b05a2af-d8f2-42c1-a086-851d57791b5f-serviceca\") pod \"node-ca-cgg4d\" (UID: \"4b05a2af-d8f2-42c1-a086-851d57791b5f\") " pod="openshift-image-registry/node-ca-cgg4d" Apr 16 16:02:59.325566 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.325451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b05a2af-d8f2-42c1-a086-851d57791b5f-host\") pod \"node-ca-cgg4d\" (UID: \"4b05a2af-d8f2-42c1-a086-851d57791b5f\") " pod="openshift-image-registry/node-ca-cgg4d" Apr 16 16:02:59.325566 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.325545 2576 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4b05a2af-d8f2-42c1-a086-851d57791b5f-host\") pod \"node-ca-cgg4d\" (UID: \"4b05a2af-d8f2-42c1-a086-851d57791b5f\") " pod="openshift-image-registry/node-ca-cgg4d" Apr 16 16:02:59.325833 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.325815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4b05a2af-d8f2-42c1-a086-851d57791b5f-serviceca\") pod \"node-ca-cgg4d\" (UID: \"4b05a2af-d8f2-42c1-a086-851d57791b5f\") " pod="openshift-image-registry/node-ca-cgg4d" Apr 16 16:02:59.338831 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.338809 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcfxz\" (UniqueName: \"kubernetes.io/projected/4b05a2af-d8f2-42c1-a086-851d57791b5f-kube-api-access-fcfxz\") pod \"node-ca-cgg4d\" (UID: \"4b05a2af-d8f2-42c1-a086-851d57791b5f\") " pod="openshift-image-registry/node-ca-cgg4d" Apr 16 16:02:59.341767 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.341745 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:02:59.404986 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.404955 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" Apr 16 16:02:59.412876 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.412847 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rvbp5" Apr 16 16:02:59.426632 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.426607 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8rbfc" Apr 16 16:02:59.431375 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.431353 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-mjlcb" Apr 16 16:02:59.437923 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.437905 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rvgfg" Apr 16 16:02:59.444872 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.444852 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" Apr 16 16:02:59.450540 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.450520 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" Apr 16 16:02:59.458090 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.458071 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6bkvd" Apr 16 16:02:59.463560 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.463539 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-cgg4d" Apr 16 16:02:59.728901 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.728870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs\") pod \"network-metrics-daemon-gvndv\" (UID: \"916e5e50-1aef-4277-971a-7f2e8ffd2703\") " pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:02:59.729067 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:59.728980 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:59.729067 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:59.729047 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs podName:916e5e50-1aef-4277-971a-7f2e8ffd2703 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:00.72902801 +0000 UTC m=+4.084018122 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs") pod "network-metrics-daemon-gvndv" (UID: "916e5e50-1aef-4277-971a-7f2e8ffd2703") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:02:59.829406 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:02:59.829376 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwvgt\" (UniqueName: \"kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt\") pod \"network-check-target-xjfdj\" (UID: \"31ee7aaa-d858-49a4-becd-246ec9f1a8c5\") " pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:02:59.829545 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:59.829493 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:02:59.829545 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:59.829508 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:02:59.829545 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:59.829517 2576 projected.go:194] Error preparing data for projected volume kube-api-access-zwvgt for pod openshift-network-diagnostics/network-check-target-xjfdj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:59.829659 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:02:59.829563 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt podName:31ee7aaa-d858-49a4-becd-246ec9f1a8c5 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:03:00.829551156 +0000 UTC m=+4.184541264 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zwvgt" (UniqueName: "kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt") pod "network-check-target-xjfdj" (UID: "31ee7aaa-d858-49a4-becd-246ec9f1a8c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:02:59.871932 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:59.871893 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc969e45c_de0a_46a1_b293_93c2eb9bcd6f.slice/crio-d02e2b769d9bfaab9583585620bf587db271967db9393509337baeffe7922cd3 WatchSource:0}: Error finding container d02e2b769d9bfaab9583585620bf587db271967db9393509337baeffe7922cd3: Status 404 returned error can't find the container with id d02e2b769d9bfaab9583585620bf587db271967db9393509337baeffe7922cd3 Apr 16 16:02:59.872948 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:59.872927 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5e28615_1240_4149_a23f_752b612f8a06.slice/crio-cc163711a05aac519c3627609f97fb583aa9f0fcc5f94db38b3673a255d092b3 WatchSource:0}: Error finding container cc163711a05aac519c3627609f97fb583aa9f0fcc5f94db38b3673a255d092b3: Status 404 returned error can't find the container with id cc163711a05aac519c3627609f97fb583aa9f0fcc5f94db38b3673a255d092b3 Apr 16 16:02:59.874275 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:59.874050 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod345463d3_76fd_4233_8808_6df63a64c4b5.slice/crio-765caf927fd090705245d758c060df954519448bf1494b69cd41b963edab54f2 WatchSource:0}: Error finding container 
765caf927fd090705245d758c060df954519448bf1494b69cd41b963edab54f2: Status 404 returned error can't find the container with id 765caf927fd090705245d758c060df954519448bf1494b69cd41b963edab54f2 Apr 16 16:02:59.876486 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:59.876372 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b05a2af_d8f2_42c1_a086_851d57791b5f.slice/crio-975f7728f3050a4f8300c4a4fd891861632ff21744f21c1eff7a202e62e22403 WatchSource:0}: Error finding container 975f7728f3050a4f8300c4a4fd891861632ff21744f21c1eff7a202e62e22403: Status 404 returned error can't find the container with id 975f7728f3050a4f8300c4a4fd891861632ff21744f21c1eff7a202e62e22403 Apr 16 16:02:59.877155 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:59.877135 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8c9cbbd_500b_410c_bc31_680efc2b8a0c.slice/crio-223d01be1b907f2de5b4356fa076de0bf72d33a14a4c2f37505e187df713b4e2 WatchSource:0}: Error finding container 223d01be1b907f2de5b4356fa076de0bf72d33a14a4c2f37505e187df713b4e2: Status 404 returned error can't find the container with id 223d01be1b907f2de5b4356fa076de0bf72d33a14a4c2f37505e187df713b4e2 Apr 16 16:02:59.878274 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:02:59.878239 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a32280f_6aec_4142_bb8f_1547ac4378ab.slice/crio-2724db965ba0375209494c715911c9bda7b6621db05e69e4d7b35c84b38ccc20 WatchSource:0}: Error finding container 2724db965ba0375209494c715911c9bda7b6621db05e69e4d7b35c84b38ccc20: Status 404 returned error can't find the container with id 2724db965ba0375209494c715911c9bda7b6621db05e69e4d7b35c84b38ccc20 Apr 16 16:03:00.180521 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:00.180341 2576 certificate_manager.go:715] "Certificate 
rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:57:58 +0000 UTC" deadline="2028-01-11 03:46:09.365525964 +0000 UTC" Apr 16 16:03:00.180521 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:00.180513 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15227h43m9.185015718s" Apr 16 16:03:00.273843 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:00.273573 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:03:00.273843 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:00.273692 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5" Apr 16 16:03:00.282798 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:00.282743 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cgg4d" event={"ID":"4b05a2af-d8f2-42c1-a086-851d57791b5f","Type":"ContainerStarted","Data":"975f7728f3050a4f8300c4a4fd891861632ff21744f21c1eff7a202e62e22403"} Apr 16 16:03:00.287965 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:00.287906 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rvbp5" event={"ID":"345463d3-76fd-4233-8808-6df63a64c4b5","Type":"ContainerStarted","Data":"765caf927fd090705245d758c060df954519448bf1494b69cd41b963edab54f2"} Apr 16 16:03:00.294857 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:00.294821 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6bkvd" 
event={"ID":"e5e28615-1240-4149-a23f-752b612f8a06","Type":"ContainerStarted","Data":"cc163711a05aac519c3627609f97fb583aa9f0fcc5f94db38b3673a255d092b3"} Apr 16 16:03:00.297815 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:00.297787 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8rbfc" event={"ID":"c969e45c-de0a-46a1-b293-93c2eb9bcd6f","Type":"ContainerStarted","Data":"d02e2b769d9bfaab9583585620bf587db271967db9393509337baeffe7922cd3"} Apr 16 16:03:00.305894 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:00.305858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-150.ec2.internal" event={"ID":"6ba071faccce2e60097e197ecb90d16a","Type":"ContainerStarted","Data":"fe2af52231e8d495344ae8b88290074a1f50a8c12a10a6fd7c8e37d34fe5553a"} Apr 16 16:03:00.307865 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:00.307800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" event={"ID":"9eca3407-9be3-44a0-8eb5-274e3e43107d","Type":"ContainerStarted","Data":"c138f116eb908af186138608e4f36c7ecf6a21afc32dafbd444427b7ef466684"} Apr 16 16:03:00.312978 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:00.312953 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" event={"ID":"d0578a47-b539-4acd-9e68-0468bd267183","Type":"ContainerStarted","Data":"62a1254ead2e113fd706bc1f60297bcb8b3bb1c5a53bad244e239b3e0308cb0a"} Apr 16 16:03:00.317232 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:00.317182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" event={"ID":"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba","Type":"ContainerStarted","Data":"24231a2ae1dfb155f9ffe7a6f0ce6399b7ac7adbeb130574f20997046cfe526f"} Apr 16 16:03:00.318933 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:00.318874 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-mjlcb" event={"ID":"4a32280f-6aec-4142-bb8f-1547ac4378ab","Type":"ContainerStarted","Data":"2724db965ba0375209494c715911c9bda7b6621db05e69e4d7b35c84b38ccc20"} Apr 16 16:03:00.324779 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:00.324728 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rvgfg" event={"ID":"e8c9cbbd-500b-410c-bc31-680efc2b8a0c","Type":"ContainerStarted","Data":"223d01be1b907f2de5b4356fa076de0bf72d33a14a4c2f37505e187df713b4e2"} Apr 16 16:03:00.738784 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:00.738748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs\") pod \"network-metrics-daemon-gvndv\" (UID: \"916e5e50-1aef-4277-971a-7f2e8ffd2703\") " pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:03:00.738959 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:00.738923 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:00.739017 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:00.738984 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs podName:916e5e50-1aef-4277-971a-7f2e8ffd2703 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:02.738965894 +0000 UTC m=+6.093956006 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs") pod "network-metrics-daemon-gvndv" (UID: "916e5e50-1aef-4277-971a-7f2e8ffd2703") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:00.839977 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:00.839943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwvgt\" (UniqueName: \"kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt\") pod \"network-check-target-xjfdj\" (UID: \"31ee7aaa-d858-49a4-becd-246ec9f1a8c5\") " pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:03:00.840128 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:00.840109 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:00.840128 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:00.840128 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:00.840254 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:00.840140 2576 projected.go:194] Error preparing data for projected volume kube-api-access-zwvgt for pod openshift-network-diagnostics/network-check-target-xjfdj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:00.840254 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:00.840195 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt podName:31ee7aaa-d858-49a4-becd-246ec9f1a8c5 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:03:02.840178162 +0000 UTC m=+6.195168272 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zwvgt" (UniqueName: "kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt") pod "network-check-target-xjfdj" (UID: "31ee7aaa-d858-49a4-becd-246ec9f1a8c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:01.276787 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:01.276700 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:03:01.277231 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:01.276835 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703" Apr 16 16:03:01.339917 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:01.338794 2576 generic.go:358] "Generic (PLEG): container finished" podID="86a89fa024a7f053b23d64348497745e" containerID="459b5fb2c9bc4ea0dd4ba9ef44a3e76f0ded5138c2c1ddbcfaf4f1d66ce9e732" exitCode=0 Apr 16 16:03:01.339917 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:01.339684 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal" event={"ID":"86a89fa024a7f053b23d64348497745e","Type":"ContainerDied","Data":"459b5fb2c9bc4ea0dd4ba9ef44a3e76f0ded5138c2c1ddbcfaf4f1d66ce9e732"} Apr 16 16:03:01.365475 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:01.364575 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-150.ec2.internal" podStartSLOduration=3.364557687 podStartE2EDuration="3.364557687s" podCreationTimestamp="2026-04-16 16:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:03:00.320508291 +0000 UTC m=+3.675498423" watchObservedRunningTime="2026-04-16 16:03:01.364557687 +0000 UTC m=+4.719547818" Apr 16 16:03:02.273830 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:02.273767 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:03:02.274006 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:02.273885 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5" Apr 16 16:03:02.344840 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:02.344804 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal" event={"ID":"86a89fa024a7f053b23d64348497745e","Type":"ContainerStarted","Data":"7d82a6deb800ebae7fea68cabc91de22a8be322d04b757af7be1058e7ff63bb5"} Apr 16 16:03:02.359587 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:02.359524 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-150.ec2.internal" podStartSLOduration=4.359507233 podStartE2EDuration="4.359507233s" podCreationTimestamp="2026-04-16 16:02:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:03:02.359096113 +0000 UTC m=+5.714086244" watchObservedRunningTime="2026-04-16 16:03:02.359507233 +0000 UTC m=+5.714497363" Apr 16 16:03:02.754432 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:02.754394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs\") pod \"network-metrics-daemon-gvndv\" (UID: \"916e5e50-1aef-4277-971a-7f2e8ffd2703\") " pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:03:02.754609 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:02.754566 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:02.754676 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:02.754628 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs 
podName:916e5e50-1aef-4277-971a-7f2e8ffd2703 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:06.754609823 +0000 UTC m=+10.109599937 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs") pod "network-metrics-daemon-gvndv" (UID: "916e5e50-1aef-4277-971a-7f2e8ffd2703") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:02.855917 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:02.855166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwvgt\" (UniqueName: \"kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt\") pod \"network-check-target-xjfdj\" (UID: \"31ee7aaa-d858-49a4-becd-246ec9f1a8c5\") " pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:03:02.855917 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:02.855429 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:02.855917 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:02.855452 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:02.855917 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:02.855467 2576 projected.go:194] Error preparing data for projected volume kube-api-access-zwvgt for pod openshift-network-diagnostics/network-check-target-xjfdj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:02.855917 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:02.855529 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt podName:31ee7aaa-d858-49a4-becd-246ec9f1a8c5 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:06.855510324 +0000 UTC m=+10.210500444 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zwvgt" (UniqueName: "kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt") pod "network-check-target-xjfdj" (UID: "31ee7aaa-d858-49a4-becd-246ec9f1a8c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:03.275181 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:03.274710 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:03:03.275181 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:03.274879 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703" Apr 16 16:03:03.325488 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:03.325457 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-ntvgj"] Apr 16 16:03:03.327854 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:03.327471 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:03.327854 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:03.327547 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ntvgj" podUID="46b0410a-fdd8-490e-b05f-b4633630c446" Apr 16 16:03:03.360374 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:03.360344 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret\") pod \"global-pull-secret-syncer-ntvgj\" (UID: \"46b0410a-fdd8-490e-b05f-b4633630c446\") " pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:03.360786 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:03.360406 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/46b0410a-fdd8-490e-b05f-b4633630c446-kubelet-config\") pod \"global-pull-secret-syncer-ntvgj\" (UID: \"46b0410a-fdd8-490e-b05f-b4633630c446\") " pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:03.360786 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:03.360444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/46b0410a-fdd8-490e-b05f-b4633630c446-dbus\") pod \"global-pull-secret-syncer-ntvgj\" (UID: \"46b0410a-fdd8-490e-b05f-b4633630c446\") " pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:03.461597 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:03.461558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret\") pod \"global-pull-secret-syncer-ntvgj\" (UID: \"46b0410a-fdd8-490e-b05f-b4633630c446\") " pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:03.461767 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:03.461619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/46b0410a-fdd8-490e-b05f-b4633630c446-kubelet-config\") pod \"global-pull-secret-syncer-ntvgj\" (UID: \"46b0410a-fdd8-490e-b05f-b4633630c446\") " pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:03.461767 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:03.461657 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/46b0410a-fdd8-490e-b05f-b4633630c446-dbus\") pod \"global-pull-secret-syncer-ntvgj\" (UID: \"46b0410a-fdd8-490e-b05f-b4633630c446\") " pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:03.461894 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:03.461828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/46b0410a-fdd8-490e-b05f-b4633630c446-dbus\") pod \"global-pull-secret-syncer-ntvgj\" (UID: \"46b0410a-fdd8-490e-b05f-b4633630c446\") " pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:03.461947 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:03.461933 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:03.462001 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:03.461988 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret podName:46b0410a-fdd8-490e-b05f-b4633630c446 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:03:03.961970073 +0000 UTC m=+7.316960194 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret") pod "global-pull-secret-syncer-ntvgj" (UID: "46b0410a-fdd8-490e-b05f-b4633630c446") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:03.462139 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:03.462109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/46b0410a-fdd8-490e-b05f-b4633630c446-kubelet-config\") pod \"global-pull-secret-syncer-ntvgj\" (UID: \"46b0410a-fdd8-490e-b05f-b4633630c446\") " pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:03.966705 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:03.966661 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret\") pod \"global-pull-secret-syncer-ntvgj\" (UID: \"46b0410a-fdd8-490e-b05f-b4633630c446\") " pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:03.966897 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:03.966837 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:03.966962 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:03.966899 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret podName:46b0410a-fdd8-490e-b05f-b4633630c446 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:04.966881639 +0000 UTC m=+8.321871751 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret") pod "global-pull-secret-syncer-ntvgj" (UID: "46b0410a-fdd8-490e-b05f-b4633630c446") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:04.274039 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:04.273941 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:03:04.274270 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:04.274060 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5" Apr 16 16:03:04.975044 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:04.974951 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret\") pod \"global-pull-secret-syncer-ntvgj\" (UID: \"46b0410a-fdd8-490e-b05f-b4633630c446\") " pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:04.975525 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:04.975140 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:04.975525 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:04.975231 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret podName:46b0410a-fdd8-490e-b05f-b4633630c446 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:03:06.975195726 +0000 UTC m=+10.330185847 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret") pod "global-pull-secret-syncer-ntvgj" (UID: "46b0410a-fdd8-490e-b05f-b4633630c446") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:05.274288 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:05.273516 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:05.274288 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:05.273518 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:03:05.274288 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:05.273642 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ntvgj" podUID="46b0410a-fdd8-490e-b05f-b4633630c446" Apr 16 16:03:05.274288 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:05.273738 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703" Apr 16 16:03:06.274018 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:06.273962 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:03:06.274492 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:06.274082 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5" Apr 16 16:03:06.788229 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:06.788101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs\") pod \"network-metrics-daemon-gvndv\" (UID: \"916e5e50-1aef-4277-971a-7f2e8ffd2703\") " pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:03:06.788400 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:06.788284 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:06.788400 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:06.788361 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs podName:916e5e50-1aef-4277-971a-7f2e8ffd2703 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:14.788338622 +0000 UTC m=+18.143328740 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs") pod "network-metrics-daemon-gvndv" (UID: "916e5e50-1aef-4277-971a-7f2e8ffd2703") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:06.888772 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:06.888735 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwvgt\" (UniqueName: \"kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt\") pod \"network-check-target-xjfdj\" (UID: \"31ee7aaa-d858-49a4-becd-246ec9f1a8c5\") " pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:03:06.888952 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:06.888892 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:06.888952 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:06.888909 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:06.888952 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:06.888933 2576 projected.go:194] Error preparing data for projected volume kube-api-access-zwvgt for pod openshift-network-diagnostics/network-check-target-xjfdj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:06.889101 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:06.888994 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt podName:31ee7aaa-d858-49a4-becd-246ec9f1a8c5 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:03:14.888975087 +0000 UTC m=+18.243965198 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zwvgt" (UniqueName: "kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt") pod "network-check-target-xjfdj" (UID: "31ee7aaa-d858-49a4-becd-246ec9f1a8c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:06.989523 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:06.989457 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret\") pod \"global-pull-secret-syncer-ntvgj\" (UID: \"46b0410a-fdd8-490e-b05f-b4633630c446\") " pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:06.989681 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:06.989627 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:06.989751 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:06.989706 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret podName:46b0410a-fdd8-490e-b05f-b4633630c446 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:10.989686628 +0000 UTC m=+14.344676750 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret") pod "global-pull-secret-syncer-ntvgj" (UID: "46b0410a-fdd8-490e-b05f-b4633630c446") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:07.274063 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:07.274028 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:07.274522 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:07.274121 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ntvgj" podUID="46b0410a-fdd8-490e-b05f-b4633630c446" Apr 16 16:03:07.274522 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:07.274143 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:03:07.274522 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:07.274277 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703" Apr 16 16:03:08.274299 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:08.274269 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:03:08.274835 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:08.274370 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5" Apr 16 16:03:09.273943 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:09.273904 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:03:09.274110 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:09.273904 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:09.274110 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:09.274017 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703" Apr 16 16:03:09.274222 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:09.274105 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ntvgj" podUID="46b0410a-fdd8-490e-b05f-b4633630c446" Apr 16 16:03:10.273798 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:10.273763 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:03:10.274182 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:10.273893 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5" Apr 16 16:03:11.020843 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:11.020806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret\") pod \"global-pull-secret-syncer-ntvgj\" (UID: \"46b0410a-fdd8-490e-b05f-b4633630c446\") " pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:11.021013 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:11.020951 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:11.021056 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:11.021016 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret podName:46b0410a-fdd8-490e-b05f-b4633630c446 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:19.021000555 +0000 UTC m=+22.375990666 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret") pod "global-pull-secret-syncer-ntvgj" (UID: "46b0410a-fdd8-490e-b05f-b4633630c446") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:03:11.273972 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:11.273889 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:11.274454 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:11.273890 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:03:11.274454 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:11.274011 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ntvgj" podUID="46b0410a-fdd8-490e-b05f-b4633630c446" Apr 16 16:03:11.274454 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:11.274118 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703" Apr 16 16:03:12.274533 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:12.274501 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj"
Apr 16 16:03:12.274960 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:12.274609 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5"
Apr 16 16:03:13.274017 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:13.273988 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj"
Apr 16 16:03:13.274181 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:13.274091 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ntvgj" podUID="46b0410a-fdd8-490e-b05f-b4633630c446"
Apr 16 16:03:13.274181 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:13.274143 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv"
Apr 16 16:03:13.274300 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:13.274244 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703"
Apr 16 16:03:14.274438 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:14.274405 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj"
Apr 16 16:03:14.274933 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:14.274511 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5"
Apr 16 16:03:14.851666 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:14.851625 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs\") pod \"network-metrics-daemon-gvndv\" (UID: \"916e5e50-1aef-4277-971a-7f2e8ffd2703\") " pod="openshift-multus/network-metrics-daemon-gvndv"
Apr 16 16:03:14.851837 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:14.851795 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:03:14.851920 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:14.851866 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs podName:916e5e50-1aef-4277-971a-7f2e8ffd2703 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:30.851845271 +0000 UTC m=+34.206835395 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs") pod "network-metrics-daemon-gvndv" (UID: "916e5e50-1aef-4277-971a-7f2e8ffd2703") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:03:14.952777 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:14.952744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwvgt\" (UniqueName: \"kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt\") pod \"network-check-target-xjfdj\" (UID: \"31ee7aaa-d858-49a4-becd-246ec9f1a8c5\") " pod="openshift-network-diagnostics/network-check-target-xjfdj"
Apr 16 16:03:14.952935 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:14.952915 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:03:14.953006 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:14.952942 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:03:14.953006 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:14.952956 2576 projected.go:194] Error preparing data for projected volume kube-api-access-zwvgt for pod openshift-network-diagnostics/network-check-target-xjfdj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:03:14.953081 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:14.953019 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt podName:31ee7aaa-d858-49a4-becd-246ec9f1a8c5 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:30.952998938 +0000 UTC m=+34.307989047 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zwvgt" (UniqueName: "kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt") pod "network-check-target-xjfdj" (UID: "31ee7aaa-d858-49a4-becd-246ec9f1a8c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:03:15.274304 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:15.274266 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj"
Apr 16 16:03:15.274481 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:15.274396 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ntvgj" podUID="46b0410a-fdd8-490e-b05f-b4633630c446"
Apr 16 16:03:15.274481 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:15.274446 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv"
Apr 16 16:03:15.274895 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:15.274562 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703"
Apr 16 16:03:16.273590 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:16.273550 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj"
Apr 16 16:03:16.273783 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:16.273688 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5"
Apr 16 16:03:17.275085 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:17.275018 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv"
Apr 16 16:03:17.275945 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:17.275648 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj"
Apr 16 16:03:17.275945 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:17.275737 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ntvgj" podUID="46b0410a-fdd8-490e-b05f-b4633630c446"
Apr 16 16:03:17.275945 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:17.275903 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703"
Apr 16 16:03:17.370602 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:17.370541 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" event={"ID":"9eca3407-9be3-44a0-8eb5-274e3e43107d","Type":"ContainerStarted","Data":"cd6bca0296c034859e0e6b541a50bcd23fa38c49d7cafad930003cc95f02d94a"}
Apr 16 16:03:17.372119 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:17.372090 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" event={"ID":"f2c0205d-61f8-4c10-bd7f-f8fcf4336fba","Type":"ContainerStarted","Data":"6d347fa40ed408399a828635bee621a3659e2999496a96b4f75984f62dba9a60"}
Apr 16 16:03:17.374523 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:17.374496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mjlcb" event={"ID":"4a32280f-6aec-4142-bb8f-1547ac4378ab","Type":"ContainerStarted","Data":"2042eb2f344f0140c1a760dd9ff53286e0e50ae5fb572e1070f94f1c79fc0f67"}
Apr 16 16:03:17.375826 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:17.375786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cgg4d" event={"ID":"4b05a2af-d8f2-42c1-a086-851d57791b5f","Type":"ContainerStarted","Data":"48ffef44906c32454c2d57c3b237c65facd6af12b8139053acd0fd407276c535"}
Apr 16 16:03:17.381782 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:17.380670 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rvbp5" event={"ID":"345463d3-76fd-4233-8808-6df63a64c4b5","Type":"ContainerStarted","Data":"cb1eed2c420ea3c8b5bd22c997de6c462d433d4f5d37b6a4b0908d2023970785"}
Apr 16 16:03:17.383431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:17.383306 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8rbfc" event={"ID":"c969e45c-de0a-46a1-b293-93c2eb9bcd6f","Type":"ContainerStarted","Data":"40e0f8a6d564dc2bea016e9d4ea80de04f99afebba2a1ded89891ae19183d2d6"}
Apr 16 16:03:17.424075 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:17.424033 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tw4j2" podStartSLOduration=3.3869712180000002 podStartE2EDuration="20.424019556s" podCreationTimestamp="2026-04-16 16:02:57 +0000 UTC" firstStartedPulling="2026-04-16 16:02:59.883391333 +0000 UTC m=+3.238381451" lastFinishedPulling="2026-04-16 16:03:16.920439668 +0000 UTC m=+20.275429789" observedRunningTime="2026-04-16 16:03:17.39771986 +0000 UTC m=+20.752709989" watchObservedRunningTime="2026-04-16 16:03:17.424019556 +0000 UTC m=+20.779009684"
Apr 16 16:03:17.452464 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:17.452425 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mjlcb" podStartSLOduration=3.38474678 podStartE2EDuration="20.452410664s" podCreationTimestamp="2026-04-16 16:02:57 +0000 UTC" firstStartedPulling="2026-04-16 16:02:59.881988899 +0000 UTC m=+3.236979007" lastFinishedPulling="2026-04-16 16:03:16.949652783 +0000 UTC m=+20.304642891" observedRunningTime="2026-04-16 16:03:17.450109834 +0000 UTC m=+20.805099964" watchObservedRunningTime="2026-04-16 16:03:17.452410664 +0000 UTC m=+20.807400771"
Apr 16 16:03:17.471139 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:17.471091 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cgg4d" podStartSLOduration=3.430408402 podStartE2EDuration="20.471072769s" podCreationTimestamp="2026-04-16 16:02:57 +0000 UTC" firstStartedPulling="2026-04-16 16:02:59.878284573 +0000 UTC m=+3.233274686" lastFinishedPulling="2026-04-16 16:03:16.918948946 +0000 UTC m=+20.273939053" observedRunningTime="2026-04-16 16:03:17.470681564 +0000 UTC m=+20.825671694" watchObservedRunningTime="2026-04-16 16:03:17.471072769 +0000 UTC m=+20.826062899"
Apr 16 16:03:18.107994 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.107854 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 16:03:18.173821 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.173724 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:03:18.107988521Z","UUID":"f5e91afb-8b5a-42ac-969d-06b6a658c181","Handler":null,"Name":"","Endpoint":""}
Apr 16 16:03:18.175419 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.175401 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 16:03:18.175419 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.175423 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 16:03:18.273764 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.273687 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj"
Apr 16 16:03:18.273888 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:18.273783 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5"
Apr 16 16:03:18.386438 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.386406 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rvgfg" event={"ID":"e8c9cbbd-500b-410c-bc31-680efc2b8a0c","Type":"ContainerStarted","Data":"f1ee493f84a2577f13f5355ce5d4112bc071ed45c8a36af8ed1724b5f48aea4d"}
Apr 16 16:03:18.387671 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.387648 2576 generic.go:358] "Generic (PLEG): container finished" podID="345463d3-76fd-4233-8808-6df63a64c4b5" containerID="cb1eed2c420ea3c8b5bd22c997de6c462d433d4f5d37b6a4b0908d2023970785" exitCode=0
Apr 16 16:03:18.387731 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.387713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rvbp5" event={"ID":"345463d3-76fd-4233-8808-6df63a64c4b5","Type":"ContainerDied","Data":"cb1eed2c420ea3c8b5bd22c997de6c462d433d4f5d37b6a4b0908d2023970785"}
Apr 16 16:03:18.388927 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.388900 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6bkvd" event={"ID":"e5e28615-1240-4149-a23f-752b612f8a06","Type":"ContainerStarted","Data":"3d44126d778c9416ab6dae2d5b8c696c7838fd1e9443bce7fecc09cfc90c7e4f"}
Apr 16 16:03:18.390539 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.390515 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" event={"ID":"9eca3407-9be3-44a0-8eb5-274e3e43107d","Type":"ContainerStarted","Data":"901afaf41b579d11e0720219bd4d5eaed61db9c0859bcba27957b421d5cf0d21"}
Apr 16 16:03:18.392941 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.392919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" event={"ID":"d0578a47-b539-4acd-9e68-0468bd267183","Type":"ContainerStarted","Data":"271a79d7ebf22e904e0f720ac8e9735f1081cde8ea16e458b8e003709bd7dba3"}
Apr 16 16:03:18.393020 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.392948 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" event={"ID":"d0578a47-b539-4acd-9e68-0468bd267183","Type":"ContainerStarted","Data":"567e16b29a35527d496b4919a3e51602ef3975a8c78157b77708346e99a980d6"}
Apr 16 16:03:18.393020 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.392962 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" event={"ID":"d0578a47-b539-4acd-9e68-0468bd267183","Type":"ContainerStarted","Data":"4b8365b94dac9bd99890bc8e9f60f1d78be116cafb0f2a9443ab9445c40a78c9"}
Apr 16 16:03:18.393020 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.392974 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" event={"ID":"d0578a47-b539-4acd-9e68-0468bd267183","Type":"ContainerStarted","Data":"2c1a0b3f411a400a918997a4c6c6cf728b03f2ef9e74f09f02f7b57c624332a2"}
Apr 16 16:03:18.393020 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.392987 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" event={"ID":"d0578a47-b539-4acd-9e68-0468bd267183","Type":"ContainerStarted","Data":"0cf70c361e1b36a155c571ea32c52b11965004ad9e67329fe56d28d80d220d5e"}
Apr 16 16:03:18.393020 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.393000 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" event={"ID":"d0578a47-b539-4acd-9e68-0468bd267183","Type":"ContainerStarted","Data":"c374ca4cabee50c80b41ad7d5f2ad6c74e07e30cc9411c68ca7361b4bd7f2c48"}
Apr 16 16:03:18.405786 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.405743 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8rbfc" podStartSLOduration=9.107604642 podStartE2EDuration="21.405732083s" podCreationTimestamp="2026-04-16 16:02:57 +0000 UTC" firstStartedPulling="2026-04-16 16:02:59.873663334 +0000 UTC m=+3.228653444" lastFinishedPulling="2026-04-16 16:03:12.171790766 +0000 UTC m=+15.526780885" observedRunningTime="2026-04-16 16:03:17.486954215 +0000 UTC m=+20.841944343" watchObservedRunningTime="2026-04-16 16:03:18.405732083 +0000 UTC m=+21.760722270"
Apr 16 16:03:18.433562 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.433524 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rvgfg" podStartSLOduration=4.394025887 podStartE2EDuration="21.433513184s" podCreationTimestamp="2026-04-16 16:02:57 +0000 UTC" firstStartedPulling="2026-04-16 16:02:59.879048436 +0000 UTC m=+3.234038558" lastFinishedPulling="2026-04-16 16:03:16.918535733 +0000 UTC m=+20.273525855" observedRunningTime="2026-04-16 16:03:18.406373705 +0000 UTC m=+21.761363834" watchObservedRunningTime="2026-04-16 16:03:18.433513184 +0000 UTC m=+21.788503312"
Apr 16 16:03:18.453498 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:18.453458 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6bkvd" podStartSLOduration=4.409270499 podStartE2EDuration="21.45344724s" podCreationTimestamp="2026-04-16 16:02:57 +0000 UTC" firstStartedPulling="2026-04-16 16:02:59.874914596 +0000 UTC m=+3.229904720" lastFinishedPulling="2026-04-16 16:03:16.919091354 +0000 UTC m=+20.274081461" observedRunningTime="2026-04-16 16:03:18.452965996 +0000 UTC m=+21.807956126" watchObservedRunningTime="2026-04-16 16:03:18.45344724 +0000 UTC m=+21.808437369"
Apr 16 16:03:19.087404 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:19.087377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret\") pod \"global-pull-secret-syncer-ntvgj\" (UID: \"46b0410a-fdd8-490e-b05f-b4633630c446\") " pod="kube-system/global-pull-secret-syncer-ntvgj"
Apr 16 16:03:19.087521 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:19.087502 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:03:19.087580 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:19.087570 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret podName:46b0410a-fdd8-490e-b05f-b4633630c446 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:35.087550328 +0000 UTC m=+38.442540454 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret") pod "global-pull-secret-syncer-ntvgj" (UID: "46b0410a-fdd8-490e-b05f-b4633630c446") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:03:19.274286 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:19.274199 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj"
Apr 16 16:03:19.274433 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:19.274198 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv"
Apr 16 16:03:19.274433 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:19.274315 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ntvgj" podUID="46b0410a-fdd8-490e-b05f-b4633630c446"
Apr 16 16:03:19.274433 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:19.274401 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703"
Apr 16 16:03:19.396403 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:19.396355 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" event={"ID":"9eca3407-9be3-44a0-8eb5-274e3e43107d","Type":"ContainerStarted","Data":"5bfdc79b419e26983bb2be2c966c32ba88d4971cce9a778abced7cb6131a0139"}
Apr 16 16:03:19.419275 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:19.419227 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-72kw2" podStartSLOduration=3.313720092 podStartE2EDuration="22.419197344s" podCreationTimestamp="2026-04-16 16:02:57 +0000 UTC" firstStartedPulling="2026-04-16 16:02:59.88332617 +0000 UTC m=+3.238316283" lastFinishedPulling="2026-04-16 16:03:18.988803428 +0000 UTC m=+22.343793535" observedRunningTime="2026-04-16 16:03:19.418572829 +0000 UTC m=+22.773562986" watchObservedRunningTime="2026-04-16 16:03:19.419197344 +0000 UTC m=+22.774187473"
Apr 16 16:03:20.273959 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:20.273930 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj"
Apr 16 16:03:20.274139 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:20.274051 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5"
Apr 16 16:03:20.400963 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:20.400919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" event={"ID":"d0578a47-b539-4acd-9e68-0468bd267183","Type":"ContainerStarted","Data":"ae0d06eadea70eade7e8d74913a1fff954c33f31a0c882506c0c10da369a686f"}
Apr 16 16:03:21.274721 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:21.274503 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv"
Apr 16 16:03:21.274874 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:21.274554 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj"
Apr 16 16:03:21.274874 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:21.274798 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703"
Apr 16 16:03:21.274972 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:21.274903 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ntvgj" podUID="46b0410a-fdd8-490e-b05f-b4633630c446"
Apr 16 16:03:21.999576 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:21.999524 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-8rbfc"
Apr 16 16:03:22.000270 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:22.000252 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-8rbfc"
Apr 16 16:03:22.274120 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:22.274100 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj"
Apr 16 16:03:22.274378 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:22.274199 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5"
Apr 16 16:03:22.407347 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:22.407315 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" event={"ID":"d0578a47-b539-4acd-9e68-0468bd267183","Type":"ContainerStarted","Data":"7dd591549b8d2fb8dadcbffd56102ab046c79d3e32eacb2328d6bf8c5dee5125"}
Apr 16 16:03:22.407785 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:22.407731 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8rbfc"
Apr 16 16:03:22.408133 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:22.408115 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8rbfc"
Apr 16 16:03:22.463222 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:22.463167 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m" podStartSLOduration=7.946619706 podStartE2EDuration="25.463153234s" podCreationTimestamp="2026-04-16 16:02:57 +0000 UTC" firstStartedPulling="2026-04-16 16:02:59.883366167 +0000 UTC m=+3.238356285" lastFinishedPulling="2026-04-16 16:03:17.399899693 +0000 UTC m=+20.754889813" observedRunningTime="2026-04-16 16:03:22.443892585 +0000 UTC m=+25.798882747" watchObservedRunningTime="2026-04-16 16:03:22.463153234 +0000 UTC m=+25.818143387"
Apr 16 16:03:23.273995 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:23.273811 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv"
Apr 16 16:03:23.274832 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:23.273811 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj"
Apr 16 16:03:23.274832 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:23.274080 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703"
Apr 16 16:03:23.274832 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:23.274134 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ntvgj" podUID="46b0410a-fdd8-490e-b05f-b4633630c446"
Apr 16 16:03:23.413097 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:23.413052 2576 generic.go:358] "Generic (PLEG): container finished" podID="345463d3-76fd-4233-8808-6df63a64c4b5" containerID="d209a79ad79443009e8c8d8645d40c90f42b8755f63756dc5c7cc4b1567951cd" exitCode=0
Apr 16 16:03:23.274832 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:23.413130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rvbp5" event={"ID":"345463d3-76fd-4233-8808-6df63a64c4b5","Type":"ContainerDied","Data":"d209a79ad79443009e8c8d8645d40c90f42b8755f63756dc5c7cc4b1567951cd"}
Apr 16 16:03:23.414010 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:23.413939 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m"
Apr 16 16:03:23.414010 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:23.413969 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m"
Apr 16 16:03:23.414010 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:23.413991 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m"
Apr 16 16:03:23.430489 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:23.430465 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m"
Apr 16 16:03:23.431047 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:23.431027 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m"
Apr 16 16:03:24.183385 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:24.183360 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ntvgj"]
Apr 16 16:03:24.183520 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:24.183508 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj"
Apr 16 16:03:24.183638 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:24.183617 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ntvgj" podUID="46b0410a-fdd8-490e-b05f-b4633630c446"
Apr 16 16:03:24.184089 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:24.184071 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gvndv"]
Apr 16 16:03:24.184176 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:24.184166 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv"
Apr 16 16:03:24.184288 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:24.184269 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703"
Apr 16 16:03:24.186768 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:24.186749 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xjfdj"]
Apr 16 16:03:24.186847 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:24.186836 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj"
Apr 16 16:03:24.186920 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:24.186906 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5"
Apr 16 16:03:24.416787 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:24.416761 2576 generic.go:358] "Generic (PLEG): container finished" podID="345463d3-76fd-4233-8808-6df63a64c4b5" containerID="a866bb4ae17cabab2e2ba70af5d708df2d5b597a56ff066b7f0967c792e980a3" exitCode=0
Apr 16 16:03:24.417171 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:24.416850 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rvbp5" event={"ID":"345463d3-76fd-4233-8808-6df63a64c4b5","Type":"ContainerDied","Data":"a866bb4ae17cabab2e2ba70af5d708df2d5b597a56ff066b7f0967c792e980a3"}
Apr 16 16:03:25.274331 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:25.274263 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj"
Apr 16 16:03:25.274451 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:25.274359 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ntvgj" podUID="46b0410a-fdd8-490e-b05f-b4633630c446"
Apr 16 16:03:25.420295 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:25.420264 2576 generic.go:358] "Generic (PLEG): container finished" podID="345463d3-76fd-4233-8808-6df63a64c4b5" containerID="d7d96f283c9d5efc0115048e10f4414caf4e041e1eb1adabfc1f5de8ecb879b0" exitCode=0
Apr 16 16:03:25.420652 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:25.420317 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rvbp5" event={"ID":"345463d3-76fd-4233-8808-6df63a64c4b5","Type":"ContainerDied","Data":"d7d96f283c9d5efc0115048e10f4414caf4e041e1eb1adabfc1f5de8ecb879b0"}
Apr 16 16:03:26.274015 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:26.273987 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj"
Apr 16 16:03:26.274170 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:26.273992 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv"
Apr 16 16:03:26.274170 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:26.274091 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5"
Apr 16 16:03:26.274284 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:26.274192 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703"
Apr 16 16:03:27.275673 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:27.275644 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj"
Apr 16 16:03:27.276132 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:27.275807 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ntvgj" podUID="46b0410a-fdd8-490e-b05f-b4633630c446"
Apr 16 16:03:28.273817 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:28.273785 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv"
Apr 16 16:03:28.273998 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:28.273786 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj"
Apr 16 16:03:28.273998 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:28.273920 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703" Apr 16 16:03:28.274089 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:28.273991 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5" Apr 16 16:03:29.273616 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:29.273577 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:29.274274 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:29.273707 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ntvgj" podUID="46b0410a-fdd8-490e-b05f-b4633630c446" Apr 16 16:03:30.274386 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:30.274347 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:03:30.274881 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:30.274357 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:03:30.274881 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:30.274480 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xjfdj" podUID="31ee7aaa-d858-49a4-becd-246ec9f1a8c5" Apr 16 16:03:30.274881 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:30.274576 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703" Apr 16 16:03:30.875295 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:30.875207 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs\") pod \"network-metrics-daemon-gvndv\" (UID: \"916e5e50-1aef-4277-971a-7f2e8ffd2703\") " pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:03:30.875511 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:30.875317 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:30.875511 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:30.875388 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs podName:916e5e50-1aef-4277-971a-7f2e8ffd2703 nodeName:}" failed. 
No retries permitted until 2026-04-16 16:04:02.875367254 +0000 UTC m=+66.230357361 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs") pod "network-metrics-daemon-gvndv" (UID: "916e5e50-1aef-4277-971a-7f2e8ffd2703") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:03:30.919748 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:30.919726 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-150.ec2.internal" event="NodeReady" Apr 16 16:03:30.919875 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:30.919849 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 16:03:30.976295 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:30.976272 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwvgt\" (UniqueName: \"kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt\") pod \"network-check-target-xjfdj\" (UID: \"31ee7aaa-d858-49a4-becd-246ec9f1a8c5\") " pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:03:30.976448 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:30.976430 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:03:30.976505 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:30.976450 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:03:30.976505 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:30.976463 2576 projected.go:194] Error preparing data for projected volume kube-api-access-zwvgt for pod openshift-network-diagnostics/network-check-target-xjfdj: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:30.976586 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:30.976517 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt podName:31ee7aaa-d858-49a4-becd-246ec9f1a8c5 nodeName:}" failed. No retries permitted until 2026-04-16 16:04:02.976500535 +0000 UTC m=+66.331490648 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-zwvgt" (UniqueName: "kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt") pod "network-check-target-xjfdj" (UID: "31ee7aaa-d858-49a4-becd-246ec9f1a8c5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:03:30.977306 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:30.977288 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-qg8tx"] Apr 16 16:03:30.999461 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:30.999435 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vzpf5"] Apr 16 16:03:30.999606 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:30.999587 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qg8tx" Apr 16 16:03:31.001821 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.001805 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l2qxk\"" Apr 16 16:03:31.002033 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.002020 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 16:03:31.002241 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.002224 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 16:03:31.003020 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.003001 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 16:03:31.017990 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.017975 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qg8tx"] Apr 16 16:03:31.018073 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.017995 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vzpf5"] Apr 16 16:03:31.018111 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.018080 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vzpf5" Apr 16 16:03:31.030044 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.030027 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 16:03:31.030318 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.030295 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jhn24\"" Apr 16 16:03:31.030318 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.030315 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 16:03:31.077090 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.077070 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert\") pod \"ingress-canary-qg8tx\" (UID: \"48e2d2c2-3b35-49d1-bb3f-46840c5001a5\") " pod="openshift-ingress-canary/ingress-canary-qg8tx" Apr 16 16:03:31.077205 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.077157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8n9w\" (UniqueName: \"kubernetes.io/projected/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-kube-api-access-x8n9w\") pod \"ingress-canary-qg8tx\" (UID: \"48e2d2c2-3b35-49d1-bb3f-46840c5001a5\") " pod="openshift-ingress-canary/ingress-canary-qg8tx" Apr 16 16:03:31.177693 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.177671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert\") pod \"ingress-canary-qg8tx\" (UID: \"48e2d2c2-3b35-49d1-bb3f-46840c5001a5\") " pod="openshift-ingress-canary/ingress-canary-qg8tx" Apr 16 16:03:31.177796 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.177718 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8n9w\" (UniqueName: \"kubernetes.io/projected/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-kube-api-access-x8n9w\") pod \"ingress-canary-qg8tx\" (UID: \"48e2d2c2-3b35-49d1-bb3f-46840c5001a5\") " pod="openshift-ingress-canary/ingress-canary-qg8tx" Apr 16 16:03:31.177796 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.177743 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/01216772-46ae-4344-a250-d689b2fe3c4c-tmp-dir\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5" Apr 16 16:03:31.177796 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.177767 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw4vh\" (UniqueName: \"kubernetes.io/projected/01216772-46ae-4344-a250-d689b2fe3c4c-kube-api-access-nw4vh\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5" Apr 16 16:03:31.177919 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:31.177831 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:31.177966 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.177917 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01216772-46ae-4344-a250-d689b2fe3c4c-config-volume\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5" Apr 16 16:03:31.177966 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.177952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5" Apr 16 16:03:31.178060 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:31.177980 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert podName:48e2d2c2-3b35-49d1-bb3f-46840c5001a5 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:31.677953934 +0000 UTC m=+35.032944058 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert") pod "ingress-canary-qg8tx" (UID: "48e2d2c2-3b35-49d1-bb3f-46840c5001a5") : secret "canary-serving-cert" not found Apr 16 16:03:31.189541 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.189514 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8n9w\" (UniqueName: \"kubernetes.io/projected/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-kube-api-access-x8n9w\") pod \"ingress-canary-qg8tx\" (UID: \"48e2d2c2-3b35-49d1-bb3f-46840c5001a5\") " pod="openshift-ingress-canary/ingress-canary-qg8tx" Apr 16 16:03:31.273695 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.273674 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj" Apr 16 16:03:31.276167 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.276150 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:03:31.278362 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.278302 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/01216772-46ae-4344-a250-d689b2fe3c4c-tmp-dir\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5" Apr 16 16:03:31.278362 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.278340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nw4vh\" (UniqueName: \"kubernetes.io/projected/01216772-46ae-4344-a250-d689b2fe3c4c-kube-api-access-nw4vh\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5" Apr 16 16:03:31.278517 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.278379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01216772-46ae-4344-a250-d689b2fe3c4c-config-volume\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5" Apr 16 16:03:31.278517 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.278405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5" Apr 16 16:03:31.278627 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:31.278575 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret 
"dns-default-metrics-tls" not found Apr 16 16:03:31.278679 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:31.278636 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls podName:01216772-46ae-4344-a250-d689b2fe3c4c nodeName:}" failed. No retries permitted until 2026-04-16 16:03:31.778618368 +0000 UTC m=+35.133608481 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls") pod "dns-default-vzpf5" (UID: "01216772-46ae-4344-a250-d689b2fe3c4c") : secret "dns-default-metrics-tls" not found Apr 16 16:03:31.279407 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.279384 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01216772-46ae-4344-a250-d689b2fe3c4c-config-volume\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5" Apr 16 16:03:31.286051 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.286004 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/01216772-46ae-4344-a250-d689b2fe3c4c-tmp-dir\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5" Apr 16 16:03:31.287331 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.287310 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw4vh\" (UniqueName: \"kubernetes.io/projected/01216772-46ae-4344-a250-d689b2fe3c4c-kube-api-access-nw4vh\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5" Apr 16 16:03:31.434058 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.433995 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-rvbp5" event={"ID":"345463d3-76fd-4233-8808-6df63a64c4b5","Type":"ContainerStarted","Data":"e88caf2ce3ab7fdd2adf32b5276af8944f921ee61b6860cba8704744babf984d"} Apr 16 16:03:31.681688 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.681657 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert\") pod \"ingress-canary-qg8tx\" (UID: \"48e2d2c2-3b35-49d1-bb3f-46840c5001a5\") " pod="openshift-ingress-canary/ingress-canary-qg8tx" Apr 16 16:03:31.681832 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:31.681742 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:31.681832 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:31.681790 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert podName:48e2d2c2-3b35-49d1-bb3f-46840c5001a5 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:32.681776645 +0000 UTC m=+36.036766762 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert") pod "ingress-canary-qg8tx" (UID: "48e2d2c2-3b35-49d1-bb3f-46840c5001a5") : secret "canary-serving-cert" not found Apr 16 16:03:31.782434 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:31.782378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5" Apr 16 16:03:31.782525 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:31.782507 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:31.782565 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:31.782557 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls podName:01216772-46ae-4344-a250-d689b2fe3c4c nodeName:}" failed. No retries permitted until 2026-04-16 16:03:32.782540541 +0000 UTC m=+36.137530663 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls") pod "dns-default-vzpf5" (UID: "01216772-46ae-4344-a250-d689b2fe3c4c") : secret "dns-default-metrics-tls" not found Apr 16 16:03:32.274284 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:32.274252 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:03:32.274437 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:32.274250 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj" Apr 16 16:03:32.277855 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:32.277828 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:03:32.277855 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:32.277828 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:03:32.278342 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:32.277829 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qf27n\"" Apr 16 16:03:32.278342 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:32.277836 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:03:32.278342 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:32.277886 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hwq6n\"" Apr 16 16:03:32.438100 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:32.438065 2576 generic.go:358] "Generic (PLEG): container finished" podID="345463d3-76fd-4233-8808-6df63a64c4b5" containerID="e88caf2ce3ab7fdd2adf32b5276af8944f921ee61b6860cba8704744babf984d" exitCode=0 Apr 16 16:03:32.438279 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:32.438127 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rvbp5" event={"ID":"345463d3-76fd-4233-8808-6df63a64c4b5","Type":"ContainerDied","Data":"e88caf2ce3ab7fdd2adf32b5276af8944f921ee61b6860cba8704744babf984d"} Apr 16 16:03:32.687991 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:32.687968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert\") pod \"ingress-canary-qg8tx\" (UID: \"48e2d2c2-3b35-49d1-bb3f-46840c5001a5\") " pod="openshift-ingress-canary/ingress-canary-qg8tx" Apr 16 16:03:32.688131 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:32.688115 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:03:32.688223 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:32.688189 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert podName:48e2d2c2-3b35-49d1-bb3f-46840c5001a5 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:34.688170802 +0000 UTC m=+38.043160915 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert") pod "ingress-canary-qg8tx" (UID: "48e2d2c2-3b35-49d1-bb3f-46840c5001a5") : secret "canary-serving-cert" not found Apr 16 16:03:32.788903 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:32.788823 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5" Apr 16 16:03:32.789012 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:32.788964 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:03:32.789059 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:32.789021 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls podName:01216772-46ae-4344-a250-d689b2fe3c4c nodeName:}" failed. 
No retries permitted until 2026-04-16 16:03:34.789007066 +0000 UTC m=+38.143997178 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls") pod "dns-default-vzpf5" (UID: "01216772-46ae-4344-a250-d689b2fe3c4c") : secret "dns-default-metrics-tls" not found
Apr 16 16:03:33.442896 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:33.442864 2576 generic.go:358] "Generic (PLEG): container finished" podID="345463d3-76fd-4233-8808-6df63a64c4b5" containerID="ebc746e9a7bb5610dc382b42fac1306429c30f4c6f1aaf959b23557cfe010616" exitCode=0
Apr 16 16:03:33.443279 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:33.442907 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rvbp5" event={"ID":"345463d3-76fd-4233-8808-6df63a64c4b5","Type":"ContainerDied","Data":"ebc746e9a7bb5610dc382b42fac1306429c30f4c6f1aaf959b23557cfe010616"}
Apr 16 16:03:34.449895 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:34.449664 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rvbp5" event={"ID":"345463d3-76fd-4233-8808-6df63a64c4b5","Type":"ContainerStarted","Data":"1c42400e61b88d864c72856f828d312e7502212169e2510f34c865a6f33e035a"}
Apr 16 16:03:34.473022 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:34.472961 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rvbp5" podStartSLOduration=6.107866981 podStartE2EDuration="37.472950076s" podCreationTimestamp="2026-04-16 16:02:57 +0000 UTC" firstStartedPulling="2026-04-16 16:02:59.876002909 +0000 UTC m=+3.230993037" lastFinishedPulling="2026-04-16 16:03:31.24108601 +0000 UTC m=+34.596076132" observedRunningTime="2026-04-16 16:03:34.471765125 +0000 UTC m=+37.826755255" watchObservedRunningTime="2026-04-16 16:03:34.472950076 +0000 UTC m=+37.827940214"
Apr 16 16:03:34.702605 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:34.702523 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert\") pod \"ingress-canary-qg8tx\" (UID: \"48e2d2c2-3b35-49d1-bb3f-46840c5001a5\") " pod="openshift-ingress-canary/ingress-canary-qg8tx"
Apr 16 16:03:34.702756 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:34.702637 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:03:34.702756 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:34.702695 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert podName:48e2d2c2-3b35-49d1-bb3f-46840c5001a5 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:38.702677204 +0000 UTC m=+42.057667317 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert") pod "ingress-canary-qg8tx" (UID: "48e2d2c2-3b35-49d1-bb3f-46840c5001a5") : secret "canary-serving-cert" not found
Apr 16 16:03:34.803537 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:34.803510 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5"
Apr 16 16:03:34.803665 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:34.803612 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:03:34.803665 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:34.803653 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls podName:01216772-46ae-4344-a250-d689b2fe3c4c nodeName:}" failed. No retries permitted until 2026-04-16 16:03:38.803641169 +0000 UTC m=+42.158631276 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls") pod "dns-default-vzpf5" (UID: "01216772-46ae-4344-a250-d689b2fe3c4c") : secret "dns-default-metrics-tls" not found
Apr 16 16:03:35.106155 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:35.106063 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret\") pod \"global-pull-secret-syncer-ntvgj\" (UID: \"46b0410a-fdd8-490e-b05f-b4633630c446\") " pod="kube-system/global-pull-secret-syncer-ntvgj"
Apr 16 16:03:35.109044 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:35.109025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/46b0410a-fdd8-490e-b05f-b4633630c446-original-pull-secret\") pod \"global-pull-secret-syncer-ntvgj\" (UID: \"46b0410a-fdd8-490e-b05f-b4633630c446\") " pod="kube-system/global-pull-secret-syncer-ntvgj"
Apr 16 16:03:35.188683 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:35.188646 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ntvgj"
Apr 16 16:03:35.357439 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:35.357361 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ntvgj"]
Apr 16 16:03:35.362192 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:03:35.362162 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46b0410a_fdd8_490e_b05f_b4633630c446.slice/crio-1f02cf65ff68cbe84c6a3571767e724a942262a2073c6272a24bc678342bb0ac WatchSource:0}: Error finding container 1f02cf65ff68cbe84c6a3571767e724a942262a2073c6272a24bc678342bb0ac: Status 404 returned error can't find the container with id 1f02cf65ff68cbe84c6a3571767e724a942262a2073c6272a24bc678342bb0ac
Apr 16 16:03:35.452192 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:35.452159 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ntvgj" event={"ID":"46b0410a-fdd8-490e-b05f-b4633630c446","Type":"ContainerStarted","Data":"1f02cf65ff68cbe84c6a3571767e724a942262a2073c6272a24bc678342bb0ac"}
Apr 16 16:03:38.734119 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:38.734081 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert\") pod \"ingress-canary-qg8tx\" (UID: \"48e2d2c2-3b35-49d1-bb3f-46840c5001a5\") " pod="openshift-ingress-canary/ingress-canary-qg8tx"
Apr 16 16:03:38.734591 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:38.734259 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:03:38.734591 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:38.734335 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert podName:48e2d2c2-3b35-49d1-bb3f-46840c5001a5 nodeName:}" failed. No retries permitted until 2026-04-16 16:03:46.734315076 +0000 UTC m=+50.089305198 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert") pod "ingress-canary-qg8tx" (UID: "48e2d2c2-3b35-49d1-bb3f-46840c5001a5") : secret "canary-serving-cert" not found
Apr 16 16:03:38.835507 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:38.835478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5"
Apr 16 16:03:38.835651 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:38.835621 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:03:38.835691 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:38.835683 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls podName:01216772-46ae-4344-a250-d689b2fe3c4c nodeName:}" failed. No retries permitted until 2026-04-16 16:03:46.835668886 +0000 UTC m=+50.190658992 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls") pod "dns-default-vzpf5" (UID: "01216772-46ae-4344-a250-d689b2fe3c4c") : secret "dns-default-metrics-tls" not found
Apr 16 16:03:39.461883 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:39.461852 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ntvgj" event={"ID":"46b0410a-fdd8-490e-b05f-b4633630c446","Type":"ContainerStarted","Data":"e5aa407387d25db61bd6b233134c58b1935bbf1b3a008370a0b66de87064d310"}
Apr 16 16:03:39.481079 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:39.480989 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-ntvgj" podStartSLOduration=32.627045948 podStartE2EDuration="36.48097593s" podCreationTimestamp="2026-04-16 16:03:03 +0000 UTC" firstStartedPulling="2026-04-16 16:03:35.363910957 +0000 UTC m=+38.718901078" lastFinishedPulling="2026-04-16 16:03:39.21784094 +0000 UTC m=+42.572831060" observedRunningTime="2026-04-16 16:03:39.480579488 +0000 UTC m=+42.835569616" watchObservedRunningTime="2026-04-16 16:03:39.48097593 +0000 UTC m=+42.835966037"
Apr 16 16:03:46.789458 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:46.789420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert\") pod \"ingress-canary-qg8tx\" (UID: \"48e2d2c2-3b35-49d1-bb3f-46840c5001a5\") " pod="openshift-ingress-canary/ingress-canary-qg8tx"
Apr 16 16:03:46.789836 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:46.789546 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:03:46.789836 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:46.789601 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert podName:48e2d2c2-3b35-49d1-bb3f-46840c5001a5 nodeName:}" failed. No retries permitted until 2026-04-16 16:04:02.789586799 +0000 UTC m=+66.144576907 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert") pod "ingress-canary-qg8tx" (UID: "48e2d2c2-3b35-49d1-bb3f-46840c5001a5") : secret "canary-serving-cert" not found
Apr 16 16:03:46.890687 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:46.890652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5"
Apr 16 16:03:46.890823 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:46.890780 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:03:46.890874 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:03:46.890845 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls podName:01216772-46ae-4344-a250-d689b2fe3c4c nodeName:}" failed. No retries permitted until 2026-04-16 16:04:02.890830405 +0000 UTC m=+66.245820513 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls") pod "dns-default-vzpf5" (UID: "01216772-46ae-4344-a250-d689b2fe3c4c") : secret "dns-default-metrics-tls" not found
Apr 16 16:03:52.565137 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.565104 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"]
Apr 16 16:03:52.568594 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.568580 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"
Apr 16 16:03:52.577783 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.577763 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 16 16:03:52.577783 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.577775 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 16 16:03:52.577925 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.577763 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 16 16:03:52.578606 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.578593 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 16 16:03:52.587533 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.587513 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"]
Apr 16 16:03:52.630223 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.630194 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9bdf21f7-d4f3-48cc-a53f-85d7088427df-klusterlet-config\") pod \"klusterlet-addon-workmgr-6f87575f49-9hw9n\" (UID: \"9bdf21f7-d4f3-48cc-a53f-85d7088427df\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"
Apr 16 16:03:52.630311 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.630249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9bdf21f7-d4f3-48cc-a53f-85d7088427df-tmp\") pod \"klusterlet-addon-workmgr-6f87575f49-9hw9n\" (UID: \"9bdf21f7-d4f3-48cc-a53f-85d7088427df\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"
Apr 16 16:03:52.630359 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.630314 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n87r6\" (UniqueName: \"kubernetes.io/projected/9bdf21f7-d4f3-48cc-a53f-85d7088427df-kube-api-access-n87r6\") pod \"klusterlet-addon-workmgr-6f87575f49-9hw9n\" (UID: \"9bdf21f7-d4f3-48cc-a53f-85d7088427df\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"
Apr 16 16:03:52.730703 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.730680 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9bdf21f7-d4f3-48cc-a53f-85d7088427df-tmp\") pod \"klusterlet-addon-workmgr-6f87575f49-9hw9n\" (UID: \"9bdf21f7-d4f3-48cc-a53f-85d7088427df\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"
Apr 16 16:03:52.730789 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.730718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n87r6\" (UniqueName: \"kubernetes.io/projected/9bdf21f7-d4f3-48cc-a53f-85d7088427df-kube-api-access-n87r6\") pod \"klusterlet-addon-workmgr-6f87575f49-9hw9n\" (UID: \"9bdf21f7-d4f3-48cc-a53f-85d7088427df\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"
Apr 16 16:03:52.730789 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.730763 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9bdf21f7-d4f3-48cc-a53f-85d7088427df-klusterlet-config\") pod \"klusterlet-addon-workmgr-6f87575f49-9hw9n\" (UID: \"9bdf21f7-d4f3-48cc-a53f-85d7088427df\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"
Apr 16 16:03:52.731020 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.731003 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9bdf21f7-d4f3-48cc-a53f-85d7088427df-tmp\") pod \"klusterlet-addon-workmgr-6f87575f49-9hw9n\" (UID: \"9bdf21f7-d4f3-48cc-a53f-85d7088427df\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"
Apr 16 16:03:52.733093 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.733071 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/9bdf21f7-d4f3-48cc-a53f-85d7088427df-klusterlet-config\") pod \"klusterlet-addon-workmgr-6f87575f49-9hw9n\" (UID: \"9bdf21f7-d4f3-48cc-a53f-85d7088427df\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"
Apr 16 16:03:52.739251 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.739232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n87r6\" (UniqueName: \"kubernetes.io/projected/9bdf21f7-d4f3-48cc-a53f-85d7088427df-kube-api-access-n87r6\") pod \"klusterlet-addon-workmgr-6f87575f49-9hw9n\" (UID: \"9bdf21f7-d4f3-48cc-a53f-85d7088427df\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"
Apr 16 16:03:52.877548 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.877496 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"
Apr 16 16:03:52.999284 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:52.999259 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"]
Apr 16 16:03:53.002055 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:03:53.002031 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bdf21f7_d4f3_48cc_a53f_85d7088427df.slice/crio-abc5b1913504db3cf29db9c8501804be39f17d6facc557938ef03532db691fe6 WatchSource:0}: Error finding container abc5b1913504db3cf29db9c8501804be39f17d6facc557938ef03532db691fe6: Status 404 returned error can't find the container with id abc5b1913504db3cf29db9c8501804be39f17d6facc557938ef03532db691fe6
Apr 16 16:03:53.485972 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:53.485937 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n" event={"ID":"9bdf21f7-d4f3-48cc-a53f-85d7088427df","Type":"ContainerStarted","Data":"abc5b1913504db3cf29db9c8501804be39f17d6facc557938ef03532db691fe6"}
Apr 16 16:03:55.433173 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:55.433140 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5fl2m"
Apr 16 16:03:57.494387 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:57.494350 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n" event={"ID":"9bdf21f7-d4f3-48cc-a53f-85d7088427df","Type":"ContainerStarted","Data":"2119d6503a3a30b2114256f7e738ca3f2471bac9eb81c8ff6069edb3e8f8eef1"}
Apr 16 16:03:57.494745 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:57.494579 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"
Apr 16 16:03:57.495979 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:57.495926 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n"
Apr 16 16:03:57.514691 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:03:57.512947 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n" podStartSLOduration=1.269474315 podStartE2EDuration="5.512928241s" podCreationTimestamp="2026-04-16 16:03:52 +0000 UTC" firstStartedPulling="2026-04-16 16:03:53.003686335 +0000 UTC m=+56.358676441" lastFinishedPulling="2026-04-16 16:03:57.247140242 +0000 UTC m=+60.602130367" observedRunningTime="2026-04-16 16:03:57.511549567 +0000 UTC m=+60.866539696" watchObservedRunningTime="2026-04-16 16:03:57.512928241 +0000 UTC m=+60.867918374"
Apr 16 16:04:02.802620 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:02.802577 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert\") pod \"ingress-canary-qg8tx\" (UID: \"48e2d2c2-3b35-49d1-bb3f-46840c5001a5\") " pod="openshift-ingress-canary/ingress-canary-qg8tx"
Apr 16 16:04:02.803017 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:04:02.802736 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:04:02.803017 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:04:02.802803 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert podName:48e2d2c2-3b35-49d1-bb3f-46840c5001a5 nodeName:}" failed. No retries permitted until 2026-04-16 16:04:34.802787598 +0000 UTC m=+98.157777705 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert") pod "ingress-canary-qg8tx" (UID: "48e2d2c2-3b35-49d1-bb3f-46840c5001a5") : secret "canary-serving-cert" not found
Apr 16 16:04:02.903261 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:02.903227 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs\") pod \"network-metrics-daemon-gvndv\" (UID: \"916e5e50-1aef-4277-971a-7f2e8ffd2703\") " pod="openshift-multus/network-metrics-daemon-gvndv"
Apr 16 16:04:02.903427 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:02.903294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5"
Apr 16 16:04:02.903427 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:04:02.903387 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:04:02.903542 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:04:02.903451 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls podName:01216772-46ae-4344-a250-d689b2fe3c4c nodeName:}" failed. No retries permitted until 2026-04-16 16:04:34.903432426 +0000 UTC m=+98.258422536 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls") pod "dns-default-vzpf5" (UID: "01216772-46ae-4344-a250-d689b2fe3c4c") : secret "dns-default-metrics-tls" not found
Apr 16 16:04:02.905660 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:02.905643 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 16:04:02.914088 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:04:02.914071 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 16:04:02.914166 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:04:02.914116 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs podName:916e5e50-1aef-4277-971a-7f2e8ffd2703 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:06.914102632 +0000 UTC m=+130.269092739 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs") pod "network-metrics-daemon-gvndv" (UID: "916e5e50-1aef-4277-971a-7f2e8ffd2703") : secret "metrics-daemon-secret" not found
Apr 16 16:04:03.004538 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:03.004512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwvgt\" (UniqueName: \"kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt\") pod \"network-check-target-xjfdj\" (UID: \"31ee7aaa-d858-49a4-becd-246ec9f1a8c5\") " pod="openshift-network-diagnostics/network-check-target-xjfdj"
Apr 16 16:04:03.007234 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:03.007200 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 16:04:03.017660 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:03.017638 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 16:04:03.027816 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:03.027801 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwvgt\" (UniqueName: \"kubernetes.io/projected/31ee7aaa-d858-49a4-becd-246ec9f1a8c5-kube-api-access-zwvgt\") pod \"network-check-target-xjfdj\" (UID: \"31ee7aaa-d858-49a4-becd-246ec9f1a8c5\") " pod="openshift-network-diagnostics/network-check-target-xjfdj"
Apr 16 16:04:03.201936 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:03.201908 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-qf27n\""
Apr 16 16:04:03.209919 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:03.209897 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xjfdj"
Apr 16 16:04:03.323270 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:03.323242 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xjfdj"]
Apr 16 16:04:03.326647 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:04:03.326622 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31ee7aaa_d858_49a4_becd_246ec9f1a8c5.slice/crio-b2ba70ef3ddd99cd2ddaadc655fc20d9b83a59dddd1e8337b12074ede8d8bab3 WatchSource:0}: Error finding container b2ba70ef3ddd99cd2ddaadc655fc20d9b83a59dddd1e8337b12074ede8d8bab3: Status 404 returned error can't find the container with id b2ba70ef3ddd99cd2ddaadc655fc20d9b83a59dddd1e8337b12074ede8d8bab3
Apr 16 16:04:03.507251 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:03.507152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xjfdj" event={"ID":"31ee7aaa-d858-49a4-becd-246ec9f1a8c5","Type":"ContainerStarted","Data":"b2ba70ef3ddd99cd2ddaadc655fc20d9b83a59dddd1e8337b12074ede8d8bab3"}
Apr 16 16:04:06.513902 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:06.513866 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xjfdj" event={"ID":"31ee7aaa-d858-49a4-becd-246ec9f1a8c5","Type":"ContainerStarted","Data":"04428bdbfb11a9291a8d12fe9c9813671da3ae0cd26b9e31f658e9b69a743573"}
Apr 16 16:04:06.514306 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:06.514061 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xjfdj"
Apr 16 16:04:06.530225 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:06.530162 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xjfdj" podStartSLOduration=66.921657938 podStartE2EDuration="1m9.53014769s" podCreationTimestamp="2026-04-16 16:02:57 +0000 UTC" firstStartedPulling="2026-04-16 16:04:03.32846133 +0000 UTC m=+66.683451437" lastFinishedPulling="2026-04-16 16:04:05.936951083 +0000 UTC m=+69.291941189" observedRunningTime="2026-04-16 16:04:06.529484707 +0000 UTC m=+69.884474837" watchObservedRunningTime="2026-04-16 16:04:06.53014769 +0000 UTC m=+69.885137818"
Apr 16 16:04:34.820527 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:34.820481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert\") pod \"ingress-canary-qg8tx\" (UID: \"48e2d2c2-3b35-49d1-bb3f-46840c5001a5\") " pod="openshift-ingress-canary/ingress-canary-qg8tx"
Apr 16 16:04:34.820932 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:04:34.820646 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:04:34.820932 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:04:34.820721 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert podName:48e2d2c2-3b35-49d1-bb3f-46840c5001a5 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:38.820706094 +0000 UTC m=+162.175696201 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert") pod "ingress-canary-qg8tx" (UID: "48e2d2c2-3b35-49d1-bb3f-46840c5001a5") : secret "canary-serving-cert" not found
Apr 16 16:04:34.920951 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:34.920917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5"
Apr 16 16:04:34.921060 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:04:34.921038 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:04:34.921096 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:04:34.921091 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls podName:01216772-46ae-4344-a250-d689b2fe3c4c nodeName:}" failed. No retries permitted until 2026-04-16 16:05:38.921078609 +0000 UTC m=+162.276068716 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls") pod "dns-default-vzpf5" (UID: "01216772-46ae-4344-a250-d689b2fe3c4c") : secret "dns-default-metrics-tls" not found
Apr 16 16:04:37.518979 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:04:37.518950 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xjfdj"
Apr 16 16:05:06.937094 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:06.937046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs\") pod \"network-metrics-daemon-gvndv\" (UID: \"916e5e50-1aef-4277-971a-7f2e8ffd2703\") " pod="openshift-multus/network-metrics-daemon-gvndv"
Apr 16 16:05:06.937679 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:06.937238 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 16:05:06.937679 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:06.937338 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs podName:916e5e50-1aef-4277-971a-7f2e8ffd2703 nodeName:}" failed. No retries permitted until 2026-04-16 16:07:08.937314625 +0000 UTC m=+252.292304733 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs") pod "network-metrics-daemon-gvndv" (UID: "916e5e50-1aef-4277-971a-7f2e8ffd2703") : secret "metrics-daemon-secret" not found
Apr 16 16:05:18.499141 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.499102 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-rqrcp"]
Apr 16 16:05:18.501920 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.501900 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rqrcp"
Apr 16 16:05:18.503334 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.503309 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-754d7b9497-rdfwj"]
Apr 16 16:05:18.504494 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.504474 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-klhxs\""
Apr 16 16:05:18.505934 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.505920 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-754d7b9497-rdfwj"
Apr 16 16:05:18.508561 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.508541 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 16:05:18.508650 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.508607 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 16:05:18.508718 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.508704 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 16:05:18.510425 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.510408 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-5j7ls\""
Apr 16 16:05:18.514638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.514603 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 16:05:18.516089 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.516068 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-rqrcp"]
Apr 16 16:05:18.522680 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.522654 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-754d7b9497-rdfwj"]
Apr 16 16:05:18.603594 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.603556 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn"]
Apr 16 16:05:18.606509 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.606486 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn"
Apr 16 16:05:18.610396 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.610374 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:05:18.612533 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.612513 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 16:05:18.614466 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.614443 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-q5rg6\""
Apr 16 16:05:18.614584 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.614483 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 16:05:18.615114 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.615100 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 16:05:18.616994 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.616970 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn"]
Apr 16 16:05:18.619622 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.619602 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fbd6bf59-f967-4545-bd0f-06cc259b395c-image-registry-private-configuration\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj"
Apr 16 16:05:18.619686 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.619633 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fbd6bf59-f967-4545-bd0f-06cc259b395c-ca-trust-extracted\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj"
Apr 16 16:05:18.619686 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.619651 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-certificates\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj"
Apr 16 16:05:18.619686 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.619671 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj"
Apr 16 16:05:18.619834 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.619694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbd6bf59-f967-4545-bd0f-06cc259b395c-trusted-ca\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj"
Apr 16 16:05:18.619834 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.619723 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName:
\"kubernetes.io/secret/fbd6bf59-f967-4545-bd0f-06cc259b395c-installation-pull-secrets\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.619834 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.619787 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-bound-sa-token\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.619834 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.619829 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlz65\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-kube-api-access-hlz65\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.619961 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.619858 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwj6t\" (UniqueName: \"kubernetes.io/projected/4dc825b4-ec3d-4c27-aa56-053ec3f50964-kube-api-access-dwj6t\") pod \"network-check-source-7b678d77c7-rqrcp\" (UID: \"4dc825b4-ec3d-4c27-aa56-053ec3f50964\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rqrcp" Apr 16 16:05:18.720632 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.720603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwj6t\" (UniqueName: \"kubernetes.io/projected/4dc825b4-ec3d-4c27-aa56-053ec3f50964-kube-api-access-dwj6t\") pod \"network-check-source-7b678d77c7-rqrcp\" (UID: 
\"4dc825b4-ec3d-4c27-aa56-053ec3f50964\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rqrcp" Apr 16 16:05:18.720808 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.720664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fbd6bf59-f967-4545-bd0f-06cc259b395c-image-registry-private-configuration\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.720808 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.720689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fbd6bf59-f967-4545-bd0f-06cc259b395c-ca-trust-extracted\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.720808 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.720707 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-certificates\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.720808 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.720726 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef511675-c77f-44dc-a5c2-e3ec14d14609-serving-cert\") pod \"service-ca-operator-69965bb79d-57wjn\" (UID: \"ef511675-c77f-44dc-a5c2-e3ec14d14609\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn" Apr 16 16:05:18.720808 ip-10-0-137-150 kubenswrapper[2576]: I0416 
16:05:18.720756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.720808 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.720774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbd6bf59-f967-4545-bd0f-06cc259b395c-trusted-ca\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.720808 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.720796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fbd6bf59-f967-4545-bd0f-06cc259b395c-installation-pull-secrets\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.721149 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.720822 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fct8\" (UniqueName: \"kubernetes.io/projected/ef511675-c77f-44dc-a5c2-e3ec14d14609-kube-api-access-9fct8\") pod \"service-ca-operator-69965bb79d-57wjn\" (UID: \"ef511675-c77f-44dc-a5c2-e3ec14d14609\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn" Apr 16 16:05:18.721149 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.720860 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-bound-sa-token\") pod 
\"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.721149 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.720886 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef511675-c77f-44dc-a5c2-e3ec14d14609-config\") pod \"service-ca-operator-69965bb79d-57wjn\" (UID: \"ef511675-c77f-44dc-a5c2-e3ec14d14609\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn" Apr 16 16:05:18.721149 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.720916 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hlz65\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-kube-api-access-hlz65\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.721149 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:18.720860 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:05:18.721149 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:18.720972 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-754d7b9497-rdfwj: secret "image-registry-tls" not found Apr 16 16:05:18.721149 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:18.721023 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls podName:fbd6bf59-f967-4545-bd0f-06cc259b395c nodeName:}" failed. No retries permitted until 2026-04-16 16:05:19.221008222 +0000 UTC m=+142.575998329 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls") pod "image-registry-754d7b9497-rdfwj" (UID: "fbd6bf59-f967-4545-bd0f-06cc259b395c") : secret "image-registry-tls" not found Apr 16 16:05:18.721527 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.721155 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fbd6bf59-f967-4545-bd0f-06cc259b395c-ca-trust-extracted\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.721588 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.721556 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-certificates\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.721969 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.721950 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbd6bf59-f967-4545-bd0f-06cc259b395c-trusted-ca\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.723401 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.723380 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fbd6bf59-f967-4545-bd0f-06cc259b395c-image-registry-private-configuration\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " 
pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.723612 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.723592 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fbd6bf59-f967-4545-bd0f-06cc259b395c-installation-pull-secrets\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.741184 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.741157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwj6t\" (UniqueName: \"kubernetes.io/projected/4dc825b4-ec3d-4c27-aa56-053ec3f50964-kube-api-access-dwj6t\") pod \"network-check-source-7b678d77c7-rqrcp\" (UID: \"4dc825b4-ec3d-4c27-aa56-053ec3f50964\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rqrcp" Apr 16 16:05:18.742994 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.742971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlz65\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-kube-api-access-hlz65\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.756687 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.756627 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-bound-sa-token\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:18.812673 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.812642 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rqrcp" Apr 16 16:05:18.821743 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.821714 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef511675-c77f-44dc-a5c2-e3ec14d14609-serving-cert\") pod \"service-ca-operator-69965bb79d-57wjn\" (UID: \"ef511675-c77f-44dc-a5c2-e3ec14d14609\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn" Apr 16 16:05:18.821870 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.821766 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fct8\" (UniqueName: \"kubernetes.io/projected/ef511675-c77f-44dc-a5c2-e3ec14d14609-kube-api-access-9fct8\") pod \"service-ca-operator-69965bb79d-57wjn\" (UID: \"ef511675-c77f-44dc-a5c2-e3ec14d14609\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn" Apr 16 16:05:18.821870 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.821793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef511675-c77f-44dc-a5c2-e3ec14d14609-config\") pod \"service-ca-operator-69965bb79d-57wjn\" (UID: \"ef511675-c77f-44dc-a5c2-e3ec14d14609\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn" Apr 16 16:05:18.822325 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.822302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef511675-c77f-44dc-a5c2-e3ec14d14609-config\") pod \"service-ca-operator-69965bb79d-57wjn\" (UID: \"ef511675-c77f-44dc-a5c2-e3ec14d14609\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn" Apr 16 16:05:18.823848 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.823831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef511675-c77f-44dc-a5c2-e3ec14d14609-serving-cert\") pod \"service-ca-operator-69965bb79d-57wjn\" (UID: \"ef511675-c77f-44dc-a5c2-e3ec14d14609\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn" Apr 16 16:05:18.831412 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.831387 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fct8\" (UniqueName: \"kubernetes.io/projected/ef511675-c77f-44dc-a5c2-e3ec14d14609-kube-api-access-9fct8\") pod \"service-ca-operator-69965bb79d-57wjn\" (UID: \"ef511675-c77f-44dc-a5c2-e3ec14d14609\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn" Apr 16 16:05:18.915166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.915136 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn" Apr 16 16:05:18.927951 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:18.927916 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-rqrcp"] Apr 16 16:05:18.931349 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:05:18.931318 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dc825b4_ec3d_4c27_aa56_053ec3f50964.slice/crio-ef407e5105df42f270521faf3120bb82efea29ae39971935abcafccd225289e6 WatchSource:0}: Error finding container ef407e5105df42f270521faf3120bb82efea29ae39971935abcafccd225289e6: Status 404 returned error can't find the container with id ef407e5105df42f270521faf3120bb82efea29ae39971935abcafccd225289e6 Apr 16 16:05:19.031867 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:19.031785 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn"] Apr 16 16:05:19.035478 ip-10-0-137-150 kubenswrapper[2576]: 
W0416 16:05:19.035449 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef511675_c77f_44dc_a5c2_e3ec14d14609.slice/crio-0841fb8dfaef605c30859aed6cac1ae7c90cc0a504558fc7b5388281bc59ca55 WatchSource:0}: Error finding container 0841fb8dfaef605c30859aed6cac1ae7c90cc0a504558fc7b5388281bc59ca55: Status 404 returned error can't find the container with id 0841fb8dfaef605c30859aed6cac1ae7c90cc0a504558fc7b5388281bc59ca55 Apr 16 16:05:19.226715 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:19.226667 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:19.226875 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:19.226829 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:05:19.226875 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:19.226848 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-754d7b9497-rdfwj: secret "image-registry-tls" not found Apr 16 16:05:19.226999 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:19.226919 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls podName:fbd6bf59-f967-4545-bd0f-06cc259b395c nodeName:}" failed. No retries permitted until 2026-04-16 16:05:20.226898808 +0000 UTC m=+143.581888934 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls") pod "image-registry-754d7b9497-rdfwj" (UID: "fbd6bf59-f967-4545-bd0f-06cc259b395c") : secret "image-registry-tls" not found Apr 16 16:05:19.653767 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:19.653724 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn" event={"ID":"ef511675-c77f-44dc-a5c2-e3ec14d14609","Type":"ContainerStarted","Data":"0841fb8dfaef605c30859aed6cac1ae7c90cc0a504558fc7b5388281bc59ca55"} Apr 16 16:05:19.655059 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:19.655023 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rqrcp" event={"ID":"4dc825b4-ec3d-4c27-aa56-053ec3f50964","Type":"ContainerStarted","Data":"53738526bef92c33f685657c6cc4fc13f8dcb7e591064943dbc20322e1667478"} Apr 16 16:05:19.655197 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:19.655063 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rqrcp" event={"ID":"4dc825b4-ec3d-4c27-aa56-053ec3f50964","Type":"ContainerStarted","Data":"ef407e5105df42f270521faf3120bb82efea29ae39971935abcafccd225289e6"} Apr 16 16:05:19.671876 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:19.671831 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-rqrcp" podStartSLOduration=1.671818193 podStartE2EDuration="1.671818193s" podCreationTimestamp="2026-04-16 16:05:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:05:19.671193216 +0000 UTC m=+143.026183350" watchObservedRunningTime="2026-04-16 16:05:19.671818193 +0000 UTC m=+143.026808322" Apr 16 16:05:20.233699 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:20.233663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:20.233896 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:20.233838 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:05:20.233896 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:20.233860 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-754d7b9497-rdfwj: secret "image-registry-tls" not found Apr 16 16:05:20.234016 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:20.233931 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls podName:fbd6bf59-f967-4545-bd0f-06cc259b395c nodeName:}" failed. No retries permitted until 2026-04-16 16:05:22.23391052 +0000 UTC m=+145.588900664 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls") pod "image-registry-754d7b9497-rdfwj" (UID: "fbd6bf59-f967-4545-bd0f-06cc259b395c") : secret "image-registry-tls" not found Apr 16 16:05:21.659667 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:21.659632 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn" event={"ID":"ef511675-c77f-44dc-a5c2-e3ec14d14609","Type":"ContainerStarted","Data":"29f52ff9d4bc6414f17322783831bc93f901944c88153ad1b83e4202da4dba0d"} Apr 16 16:05:21.677252 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:21.677188 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn" podStartSLOduration=2.066294939 podStartE2EDuration="3.677173827s" podCreationTimestamp="2026-04-16 16:05:18 +0000 UTC" firstStartedPulling="2026-04-16 16:05:19.037341314 +0000 UTC m=+142.392331424" lastFinishedPulling="2026-04-16 16:05:20.648220191 +0000 UTC m=+144.003210312" observedRunningTime="2026-04-16 16:05:21.676638221 +0000 UTC m=+145.031628351" watchObservedRunningTime="2026-04-16 16:05:21.677173827 +0000 UTC m=+145.032163956" Apr 16 16:05:22.251085 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:22.251047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" Apr 16 16:05:22.251276 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:22.251160 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 16:05:22.251276 ip-10-0-137-150 kubenswrapper[2576]: 
E0416 16:05:22.251172 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-754d7b9497-rdfwj: secret "image-registry-tls" not found Apr 16 16:05:22.251276 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:22.251235 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls podName:fbd6bf59-f967-4545-bd0f-06cc259b395c nodeName:}" failed. No retries permitted until 2026-04-16 16:05:26.251204325 +0000 UTC m=+149.606194432 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls") pod "image-registry-754d7b9497-rdfwj" (UID: "fbd6bf59-f967-4545-bd0f-06cc259b395c") : secret "image-registry-tls" not found Apr 16 16:05:23.946525 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:23.946489 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-knx4c"] Apr 16 16:05:23.949827 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:23.949809 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-knx4c" Apr 16 16:05:23.952148 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:23.952120 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 16:05:23.952296 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:23.952130 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-p8jg7\"" Apr 16 16:05:23.952296 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:23.952128 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 16:05:23.959446 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:23.959425 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-knx4c"] Apr 16 16:05:24.065980 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:24.065935 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgg8w\" (UniqueName: \"kubernetes.io/projected/032c92aa-e753-46b9-9966-8c772b88005b-kube-api-access-zgg8w\") pod \"migrator-64d4d94569-knx4c\" (UID: \"032c92aa-e753-46b9-9966-8c772b88005b\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-knx4c" Apr 16 16:05:24.166608 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:24.166568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zgg8w\" (UniqueName: \"kubernetes.io/projected/032c92aa-e753-46b9-9966-8c772b88005b-kube-api-access-zgg8w\") pod \"migrator-64d4d94569-knx4c\" (UID: \"032c92aa-e753-46b9-9966-8c772b88005b\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-knx4c" Apr 16 16:05:24.175900 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:05:24.175874 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgg8w\" (UniqueName: \"kubernetes.io/projected/032c92aa-e753-46b9-9966-8c772b88005b-kube-api-access-zgg8w\") pod \"migrator-64d4d94569-knx4c\" (UID: \"032c92aa-e753-46b9-9966-8c772b88005b\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-knx4c" Apr 16 16:05:24.258254 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:24.258138 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-knx4c" Apr 16 16:05:24.378434 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:24.378404 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-knx4c"] Apr 16 16:05:24.382364 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:05:24.382339 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod032c92aa_e753_46b9_9966_8c772b88005b.slice/crio-616ba5266ed4bf130434fddb44e272f13180b583051036bd6e58be18522e0d69 WatchSource:0}: Error finding container 616ba5266ed4bf130434fddb44e272f13180b583051036bd6e58be18522e0d69: Status 404 returned error can't find the container with id 616ba5266ed4bf130434fddb44e272f13180b583051036bd6e58be18522e0d69 Apr 16 16:05:24.666516 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:24.666476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-knx4c" event={"ID":"032c92aa-e753-46b9-9966-8c772b88005b","Type":"ContainerStarted","Data":"616ba5266ed4bf130434fddb44e272f13180b583051036bd6e58be18522e0d69"} Apr 16 16:05:24.677087 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:24.677061 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6bkvd_e5e28615-1240-4149-a23f-752b612f8a06/dns-node-resolver/0.log" 
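The entries above keep repeating the same failure signature: `MountVolume.SetUp failed for volume "registry-tls" … secret "image-registry-tls" not found`. A minimal sketch for pulling those failures out of a journal dump like this one (the regex assumes the exact `MountVolume.SetUp failed … secret "…" not found` phrasing shown in these entries; `mount_failures` is a hypothetical helper, not part of any tool):

```python
import re

# Matches the kubelet mount-failure message format seen in the entries above.
FAILURE_RE = re.compile(
    r'MountVolume\.SetUp failed for volume "(?P<volume>[^"]+)".*?'
    r'pod "(?P<pod>[^"]+)".*?secret "(?P<secret>[^"]+)" not found'
)

def mount_failures(lines):
    """Yield (volume, pod, missing_secret) for each failure entry."""
    for line in lines:
        m = FAILURE_RE.search(line)
        if m:
            yield m.group("volume"), m.group("pod"), m.group("secret")

# Entry copied verbatim from the log above.
sample = ('Error: MountVolume.SetUp failed for volume "registry-tls" '
          '(UniqueName: "kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls") '
          'pod "image-registry-754d7b9497-rdfwj" (UID: "fbd6bf59-f967-4545-bd0f-06cc259b395c") '
          ': secret "image-registry-tls" not found')
print(list(mount_failures([sample])))
```

Feeding it the whole journal would show every occurrence traces back to the same two missing secrets, `image-registry-tls` and `metrics-daemon-secret`.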
Apr 16 16:05:25.670636 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:25.670597 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-knx4c" event={"ID":"032c92aa-e753-46b9-9966-8c772b88005b","Type":"ContainerStarted","Data":"0c0a1f85d0487512d5197610026803113c7c61230b37590744ac939c19a0c37e"}
Apr 16 16:05:25.670636 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:25.670634 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-knx4c" event={"ID":"032c92aa-e753-46b9-9966-8c772b88005b","Type":"ContainerStarted","Data":"10f817142f9932d3ff54c9267bc4f8c4850152f73b8b980e3d2bd1fe48358a4b"}
Apr 16 16:05:26.075049 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:26.074980 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cgg4d_4b05a2af-d8f2-42c1-a086-851d57791b5f/node-ca/0.log"
Apr 16 16:05:26.284702 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:26.284664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj"
Apr 16 16:05:26.284870 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:26.284817 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 16:05:26.284870 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:26.284838 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-754d7b9497-rdfwj: secret "image-registry-tls" not found
Apr 16 16:05:26.284945 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:26.284893 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls podName:fbd6bf59-f967-4545-bd0f-06cc259b395c nodeName:}" failed. No retries permitted until 2026-04-16 16:05:34.28487759 +0000 UTC m=+157.639867697 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls") pod "image-registry-754d7b9497-rdfwj" (UID: "fbd6bf59-f967-4545-bd0f-06cc259b395c") : secret "image-registry-tls" not found
Apr 16 16:05:34.009540 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:34.009447 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-qg8tx" podUID="48e2d2c2-3b35-49d1-bb3f-46840c5001a5"
Apr 16 16:05:34.025641 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:34.025598 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-vzpf5" podUID="01216772-46ae-4344-a250-d689b2fe3c4c"
Apr 16 16:05:34.342081 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:34.341996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj"
Apr 16 16:05:34.344462 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:34.344430 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls\") pod \"image-registry-754d7b9497-rdfwj\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") " pod="openshift-image-registry/image-registry-754d7b9497-rdfwj"
Apr 16 16:05:34.418044 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:34.418007 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-754d7b9497-rdfwj"
Apr 16 16:05:34.537702 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:34.537638 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-knx4c" podStartSLOduration=10.651526462 podStartE2EDuration="11.537617634s" podCreationTimestamp="2026-04-16 16:05:23 +0000 UTC" firstStartedPulling="2026-04-16 16:05:24.384175806 +0000 UTC m=+147.739165917" lastFinishedPulling="2026-04-16 16:05:25.270266978 +0000 UTC m=+148.625257089" observedRunningTime="2026-04-16 16:05:25.692037478 +0000 UTC m=+149.047027607" watchObservedRunningTime="2026-04-16 16:05:34.537617634 +0000 UTC m=+157.892607763"
Apr 16 16:05:34.538098 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:34.538082 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-754d7b9497-rdfwj"]
Apr 16 16:05:34.541153 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:05:34.541128 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbd6bf59_f967_4545_bd0f_06cc259b395c.slice/crio-9795094f0e5c3bf222519d5710beb124141402568ab0cc460dcbd78458b68c21 WatchSource:0}: Error finding container 9795094f0e5c3bf222519d5710beb124141402568ab0cc460dcbd78458b68c21: Status 404 returned error can't find the container with id 9795094f0e5c3bf222519d5710beb124141402568ab0cc460dcbd78458b68c21
Apr 16 16:05:34.696897 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:34.696855 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" event={"ID":"fbd6bf59-f967-4545-bd0f-06cc259b395c","Type":"ContainerStarted","Data":"267e072242b1ba4c657e87c1f40f14deea4b6e99dcb0744bf526eb50c3c6c03b"}
Apr 16 16:05:34.696897 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:34.696896 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" event={"ID":"fbd6bf59-f967-4545-bd0f-06cc259b395c","Type":"ContainerStarted","Data":"9795094f0e5c3bf222519d5710beb124141402568ab0cc460dcbd78458b68c21"}
Apr 16 16:05:34.696897 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:34.696864 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qg8tx"
Apr 16 16:05:34.697169 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:34.697079 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-754d7b9497-rdfwj"
Apr 16 16:05:34.718277 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:34.718192 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" podStartSLOduration=16.718177272 podStartE2EDuration="16.718177272s" podCreationTimestamp="2026-04-16 16:05:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:05:34.717505164 +0000 UTC m=+158.072495294" watchObservedRunningTime="2026-04-16 16:05:34.718177272 +0000 UTC m=+158.073167402"
Apr 16 16:05:35.283690 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:35.283650 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-gvndv" podUID="916e5e50-1aef-4277-971a-7f2e8ffd2703"
Apr 16 16:05:38.876984 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:38.876936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert\") pod \"ingress-canary-qg8tx\" (UID: \"48e2d2c2-3b35-49d1-bb3f-46840c5001a5\") " pod="openshift-ingress-canary/ingress-canary-qg8tx"
Apr 16 16:05:38.879310 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:38.879288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/48e2d2c2-3b35-49d1-bb3f-46840c5001a5-cert\") pod \"ingress-canary-qg8tx\" (UID: \"48e2d2c2-3b35-49d1-bb3f-46840c5001a5\") " pod="openshift-ingress-canary/ingress-canary-qg8tx"
Apr 16 16:05:38.900694 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:38.900664 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l2qxk\""
Apr 16 16:05:38.908679 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:38.908655 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qg8tx"
Apr 16 16:05:38.977951 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:38.977919 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5"
Apr 16 16:05:38.980250 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:38.980225 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/01216772-46ae-4344-a250-d689b2fe3c4c-metrics-tls\") pod \"dns-default-vzpf5\" (UID: \"01216772-46ae-4344-a250-d689b2fe3c4c\") " pod="openshift-dns/dns-default-vzpf5"
Apr 16 16:05:39.027473 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:39.027432 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qg8tx"]
Apr 16 16:05:39.031129 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:05:39.031094 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48e2d2c2_3b35_49d1_bb3f_46840c5001a5.slice/crio-8e2e4ae3d9b425c63ba0a382b18026fbadd540143b90776c41e153a14fd84fb8 WatchSource:0}: Error finding container 8e2e4ae3d9b425c63ba0a382b18026fbadd540143b90776c41e153a14fd84fb8: Status 404 returned error can't find the container with id 8e2e4ae3d9b425c63ba0a382b18026fbadd540143b90776c41e153a14fd84fb8
Apr 16 16:05:39.710297 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:39.710253 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qg8tx" event={"ID":"48e2d2c2-3b35-49d1-bb3f-46840c5001a5","Type":"ContainerStarted","Data":"8e2e4ae3d9b425c63ba0a382b18026fbadd540143b90776c41e153a14fd84fb8"}
Apr 16 16:05:40.714320 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:40.714277 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qg8tx" event={"ID":"48e2d2c2-3b35-49d1-bb3f-46840c5001a5","Type":"ContainerStarted","Data":"79a68db86b140eba186d76139a848fe8998df6d4228a3dbea610c1be1741e701"}
Apr 16 16:05:40.732146 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:40.732092 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qg8tx" podStartSLOduration=129.145255742 podStartE2EDuration="2m10.732076105s" podCreationTimestamp="2026-04-16 16:03:30 +0000 UTC" firstStartedPulling="2026-04-16 16:05:39.033114407 +0000 UTC m=+162.388104528" lastFinishedPulling="2026-04-16 16:05:40.619934781 +0000 UTC m=+163.974924891" observedRunningTime="2026-04-16 16:05:40.730886116 +0000 UTC m=+164.085876245" watchObservedRunningTime="2026-04-16 16:05:40.732076105 +0000 UTC m=+164.087066267"
Apr 16 16:05:44.008554 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.008510 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-5tv2m"]
Apr 16 16:05:44.011871 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.011851 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.015339 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.015317 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 16:05:44.015512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.015493 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 16:05:44.015597 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.015518 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 16:05:44.015597 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.015521 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 16:05:44.015707 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.015604 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-8s6r8\""
Apr 16 16:05:44.022541 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.022520 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-754d7b9497-rdfwj"]
Apr 16 16:05:44.034751 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.034724 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5tv2m"]
Apr 16 16:05:44.084466 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.084432 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-545c64844d-f4pv2"]
Apr 16 16:05:44.087380 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.087362 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.103457 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.103428 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-545c64844d-f4pv2"]
Apr 16 16:05:44.112457 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.112432 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-image-registry-private-configuration\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.112602 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.112466 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/48910414-96d3-4257-899f-e58464821442-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5tv2m\" (UID: \"48910414-96d3-4257-899f-e58464821442\") " pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.112602 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.112486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/48910414-96d3-4257-899f-e58464821442-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5tv2m\" (UID: \"48910414-96d3-4257-899f-e58464821442\") " pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.112602 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.112514 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-installation-pull-secrets\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.112602 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.112566 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-ca-trust-extracted\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.112732 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.112656 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfhxg\" (UniqueName: \"kubernetes.io/projected/48910414-96d3-4257-899f-e58464821442-kube-api-access-hfhxg\") pod \"insights-runtime-extractor-5tv2m\" (UID: \"48910414-96d3-4257-899f-e58464821442\") " pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.112732 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.112686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jzb5\" (UniqueName: \"kubernetes.io/projected/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-kube-api-access-2jzb5\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.112732 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.112712 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-registry-tls\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.112829 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.112753 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/48910414-96d3-4257-899f-e58464821442-crio-socket\") pod \"insights-runtime-extractor-5tv2m\" (UID: \"48910414-96d3-4257-899f-e58464821442\") " pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.112829 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.112781 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-trusted-ca\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.112829 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.112810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-bound-sa-token\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.112918 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.112849 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-registry-certificates\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.112918 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.112912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/48910414-96d3-4257-899f-e58464821442-data-volume\") pod \"insights-runtime-extractor-5tv2m\" (UID: \"48910414-96d3-4257-899f-e58464821442\") " pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.213494 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.213459 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-registry-tls\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.213494 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.213498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/48910414-96d3-4257-899f-e58464821442-crio-socket\") pod \"insights-runtime-extractor-5tv2m\" (UID: \"48910414-96d3-4257-899f-e58464821442\") " pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.213754 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.213521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-trusted-ca\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.213754 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.213537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-bound-sa-token\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.213754 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.213558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-registry-certificates\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.213754 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.213576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/48910414-96d3-4257-899f-e58464821442-data-volume\") pod \"insights-runtime-extractor-5tv2m\" (UID: \"48910414-96d3-4257-899f-e58464821442\") " pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.213754 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.213622 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/48910414-96d3-4257-899f-e58464821442-crio-socket\") pod \"insights-runtime-extractor-5tv2m\" (UID: \"48910414-96d3-4257-899f-e58464821442\") " pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.213754 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.213654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-image-registry-private-configuration\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.213754 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.213704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/48910414-96d3-4257-899f-e58464821442-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5tv2m\" (UID: \"48910414-96d3-4257-899f-e58464821442\") " pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.213754 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.213729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/48910414-96d3-4257-899f-e58464821442-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5tv2m\" (UID: \"48910414-96d3-4257-899f-e58464821442\") " pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.214135 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.213773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-installation-pull-secrets\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.214135 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.214012 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/48910414-96d3-4257-899f-e58464821442-data-volume\") pod \"insights-runtime-extractor-5tv2m\" (UID: \"48910414-96d3-4257-899f-e58464821442\") " pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.214135 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.214117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-ca-trust-extracted\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.214309 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.214161 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfhxg\" (UniqueName: \"kubernetes.io/projected/48910414-96d3-4257-899f-e58464821442-kube-api-access-hfhxg\") pod \"insights-runtime-extractor-5tv2m\" (UID: \"48910414-96d3-4257-899f-e58464821442\") " pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.214309 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.214180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jzb5\" (UniqueName: \"kubernetes.io/projected/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-kube-api-access-2jzb5\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.214385 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.214313 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/48910414-96d3-4257-899f-e58464821442-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-5tv2m\" (UID: \"48910414-96d3-4257-899f-e58464821442\") " pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.214704 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.214527 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-registry-certificates\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.214809 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.214774 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-ca-trust-extracted\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.215388 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.215365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-trusted-ca\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.216409 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.216388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-registry-tls\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.216684 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.216665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-installation-pull-secrets\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.216758 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.216728 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/48910414-96d3-4257-899f-e58464821442-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-5tv2m\" (UID: \"48910414-96d3-4257-899f-e58464821442\") " pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.217035 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.217018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-image-registry-private-configuration\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.231804 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.231778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-bound-sa-token\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.232094 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.232073 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jzb5\" (UniqueName: \"kubernetes.io/projected/fe7bede0-0e38-4e70-89e7-de62eb29aaa4-kube-api-access-2jzb5\") pod \"image-registry-545c64844d-f4pv2\" (UID: \"fe7bede0-0e38-4e70-89e7-de62eb29aaa4\") " pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.232849 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.232830 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfhxg\" (UniqueName: \"kubernetes.io/projected/48910414-96d3-4257-899f-e58464821442-kube-api-access-hfhxg\") pod \"insights-runtime-extractor-5tv2m\" (UID: \"48910414-96d3-4257-899f-e58464821442\") " pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.320444 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.320352 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-5tv2m"
Apr 16 16:05:44.396709 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.396676 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.464493 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.464087 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-5tv2m"]
Apr 16 16:05:44.468284 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:05:44.468149 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48910414_96d3_4257_899f_e58464821442.slice/crio-9baf809b35df073b629531fcd74c20f835296ff0006fc9366805951b4a17c085 WatchSource:0}: Error finding container 9baf809b35df073b629531fcd74c20f835296ff0006fc9366805951b4a17c085: Status 404 returned error can't find the container with id 9baf809b35df073b629531fcd74c20f835296ff0006fc9366805951b4a17c085
Apr 16 16:05:44.538266 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.538242 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-545c64844d-f4pv2"]
Apr 16 16:05:44.540990 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:05:44.540968 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe7bede0_0e38_4e70_89e7_de62eb29aaa4.slice/crio-d36915edd4c281c4ae8d9618d8a5868c028b8b628b311ddd0a0973599eaf9eea WatchSource:0}: Error finding container d36915edd4c281c4ae8d9618d8a5868c028b8b628b311ddd0a0973599eaf9eea: Status 404 returned error can't find the container with id d36915edd4c281c4ae8d9618d8a5868c028b8b628b311ddd0a0973599eaf9eea
Apr 16 16:05:44.724385 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.724348 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-545c64844d-f4pv2" event={"ID":"fe7bede0-0e38-4e70-89e7-de62eb29aaa4","Type":"ContainerStarted","Data":"8d450691033b305fb2a25b1c847ee904567655a63cda3373181eacced73d9283"}
Apr 16 16:05:44.724385 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.724389 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-545c64844d-f4pv2" event={"ID":"fe7bede0-0e38-4e70-89e7-de62eb29aaa4","Type":"ContainerStarted","Data":"d36915edd4c281c4ae8d9618d8a5868c028b8b628b311ddd0a0973599eaf9eea"}
Apr 16 16:05:44.724646 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.724536 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:05:44.725620 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.725598 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5tv2m" event={"ID":"48910414-96d3-4257-899f-e58464821442","Type":"ContainerStarted","Data":"f8973fd1a8b20a279861b912446bc283ce22d1cb00d6446ef2419952f52b89c5"}
Apr 16 16:05:44.725620 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.725624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5tv2m" event={"ID":"48910414-96d3-4257-899f-e58464821442","Type":"ContainerStarted","Data":"9baf809b35df073b629531fcd74c20f835296ff0006fc9366805951b4a17c085"}
Apr 16 16:05:44.764302 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:44.764244 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-545c64844d-f4pv2" podStartSLOduration=0.764203327 podStartE2EDuration="764.203327ms" podCreationTimestamp="2026-04-16 16:05:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:05:44.762448033 +0000 UTC m=+168.117438162" watchObservedRunningTime="2026-04-16 16:05:44.764203327 +0000 UTC m=+168.119193493"
Apr 16 16:05:45.277062 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:45.277033 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vzpf5"
Apr 16 16:05:45.280417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:45.280399 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jhn24\""
Apr 16 16:05:45.288375 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:45.288352 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vzpf5"
Apr 16 16:05:45.412437 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:45.412406 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vzpf5"]
Apr 16 16:05:45.417091 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:05:45.417058 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01216772_46ae_4344_a250_d689b2fe3c4c.slice/crio-b4d60f0ae6da0a11c30eefecb585566180d23a1e955d38bbe17e72b37ca06716 WatchSource:0}: Error finding container b4d60f0ae6da0a11c30eefecb585566180d23a1e955d38bbe17e72b37ca06716: Status 404 returned error can't find the container with id b4d60f0ae6da0a11c30eefecb585566180d23a1e955d38bbe17e72b37ca06716
Apr 16 16:05:45.729629 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:45.729507 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vzpf5" event={"ID":"01216772-46ae-4344-a250-d689b2fe3c4c","Type":"ContainerStarted","Data":"b4d60f0ae6da0a11c30eefecb585566180d23a1e955d38bbe17e72b37ca06716"}
Apr 16 16:05:45.731014 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:45.730987 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5tv2m" event={"ID":"48910414-96d3-4257-899f-e58464821442","Type":"ContainerStarted","Data":"df54e3dfb7b452bbde70a1fce1cd50e82b4a3fb27668b6eb51715ca3a93bc455"}
Apr 16 16:05:47.738547 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:47.738512 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vzpf5" event={"ID":"01216772-46ae-4344-a250-d689b2fe3c4c","Type":"ContainerStarted","Data":"4aefa17da155dd0aa3bbc02fa9355b2d86d1532d76be7a93b3a7da24dcd9fffc"}
Apr 16 16:05:47.738547 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:47.738548 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vzpf5" event={"ID":"01216772-46ae-4344-a250-d689b2fe3c4c","Type":"ContainerStarted","Data":"24db116afaaac351a81b534391fefc8dc3586b48dad492606b4b9e04f86b2bad"}
Apr 16 16:05:47.739046 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:47.738630 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-vzpf5"
Apr 16 16:05:47.740221 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:47.740187 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-5tv2m" event={"ID":"48910414-96d3-4257-899f-e58464821442","Type":"ContainerStarted","Data":"e7346a23192d8fd4982efd6540a813ca48134377c58ba5203fbaef2d8f4306be"}
Apr 16 16:05:47.785669 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:47.785614 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vzpf5" podStartSLOduration=136.09295199 podStartE2EDuration="2m17.785598084s" podCreationTimestamp="2026-04-16 16:03:30 +0000 UTC" firstStartedPulling="2026-04-16 16:05:45.418994785 +0000 UTC m=+168.773984904" lastFinishedPulling="2026-04-16 16:05:47.111640891 +0000 UTC m=+170.466630998" observedRunningTime="2026-04-16 16:05:47.761921528 +0000 UTC m=+171.116911657" watchObservedRunningTime="2026-04-16 16:05:47.785598084 +0000 UTC m=+171.140588240"
Apr 16
16:05:47.785831 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:47.785775 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-5tv2m" podStartSLOduration=2.225551897 podStartE2EDuration="4.78577068s" podCreationTimestamp="2026-04-16 16:05:43 +0000 UTC" firstStartedPulling="2026-04-16 16:05:44.548579175 +0000 UTC m=+167.903569282" lastFinishedPulling="2026-04-16 16:05:47.108797958 +0000 UTC m=+170.463788065" observedRunningTime="2026-04-16 16:05:47.785565478 +0000 UTC m=+171.140555618" watchObservedRunningTime="2026-04-16 16:05:47.78577068 +0000 UTC m=+171.140760809" Apr 16 16:05:48.274156 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:48.274120 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:05:54.027666 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:54.027631 2576 patch_prober.go:28] interesting pod/image-registry-754d7b9497-rdfwj container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 16:05:54.028023 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:54.027685 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" podUID="fbd6bf59-f967-4545-bd0f-06cc259b395c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:05:55.409313 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.409274 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-sfgcn"] Apr 16 16:05:55.414083 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.414059 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" Apr 16 16:05:55.416874 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.416851 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 16:05:55.416961 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.416889 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:05:55.418106 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.418084 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 16:05:55.418106 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.418100 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:05:55.418306 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.418111 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:05:55.418306 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.418171 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-b99tb\"" Apr 16 16:05:55.423582 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.423559 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-sfgcn"] Apr 16 16:05:55.505357 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.505319 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4973f86a-c513-4ca6-a939-13d71612d2c0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-sfgcn\" 
(UID: \"4973f86a-c513-4ca6-a939-13d71612d2c0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" Apr 16 16:05:55.505529 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.505368 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4973f86a-c513-4ca6-a939-13d71612d2c0-metrics-client-ca\") pod \"prometheus-operator-78f957474d-sfgcn\" (UID: \"4973f86a-c513-4ca6-a939-13d71612d2c0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" Apr 16 16:05:55.505529 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.505422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4973f86a-c513-4ca6-a939-13d71612d2c0-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-sfgcn\" (UID: \"4973f86a-c513-4ca6-a939-13d71612d2c0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" Apr 16 16:05:55.505529 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.505493 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2pv7\" (UniqueName: \"kubernetes.io/projected/4973f86a-c513-4ca6-a939-13d71612d2c0-kube-api-access-t2pv7\") pod \"prometheus-operator-78f957474d-sfgcn\" (UID: \"4973f86a-c513-4ca6-a939-13d71612d2c0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" Apr 16 16:05:55.605909 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.605869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2pv7\" (UniqueName: \"kubernetes.io/projected/4973f86a-c513-4ca6-a939-13d71612d2c0-kube-api-access-t2pv7\") pod \"prometheus-operator-78f957474d-sfgcn\" (UID: \"4973f86a-c513-4ca6-a939-13d71612d2c0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" Apr 16 16:05:55.605998 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:05:55.605927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4973f86a-c513-4ca6-a939-13d71612d2c0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-sfgcn\" (UID: \"4973f86a-c513-4ca6-a939-13d71612d2c0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" Apr 16 16:05:55.605998 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.605955 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4973f86a-c513-4ca6-a939-13d71612d2c0-metrics-client-ca\") pod \"prometheus-operator-78f957474d-sfgcn\" (UID: \"4973f86a-c513-4ca6-a939-13d71612d2c0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" Apr 16 16:05:55.605998 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.605978 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4973f86a-c513-4ca6-a939-13d71612d2c0-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-sfgcn\" (UID: \"4973f86a-c513-4ca6-a939-13d71612d2c0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" Apr 16 16:05:55.606182 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:55.606162 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 16:05:55.606287 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:05:55.606276 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4973f86a-c513-4ca6-a939-13d71612d2c0-prometheus-operator-tls podName:4973f86a-c513-4ca6-a939-13d71612d2c0 nodeName:}" failed. No retries permitted until 2026-04-16 16:05:56.106257486 +0000 UTC m=+179.461247601 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/4973f86a-c513-4ca6-a939-13d71612d2c0-prometheus-operator-tls") pod "prometheus-operator-78f957474d-sfgcn" (UID: "4973f86a-c513-4ca6-a939-13d71612d2c0") : secret "prometheus-operator-tls" not found Apr 16 16:05:55.606583 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.606566 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4973f86a-c513-4ca6-a939-13d71612d2c0-metrics-client-ca\") pod \"prometheus-operator-78f957474d-sfgcn\" (UID: \"4973f86a-c513-4ca6-a939-13d71612d2c0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" Apr 16 16:05:55.608287 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.608271 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4973f86a-c513-4ca6-a939-13d71612d2c0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-sfgcn\" (UID: \"4973f86a-c513-4ca6-a939-13d71612d2c0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" Apr 16 16:05:55.616135 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:55.616108 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2pv7\" (UniqueName: \"kubernetes.io/projected/4973f86a-c513-4ca6-a939-13d71612d2c0-kube-api-access-t2pv7\") pod \"prometheus-operator-78f957474d-sfgcn\" (UID: \"4973f86a-c513-4ca6-a939-13d71612d2c0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" Apr 16 16:05:56.110701 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:56.110659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4973f86a-c513-4ca6-a939-13d71612d2c0-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-sfgcn\" (UID: 
\"4973f86a-c513-4ca6-a939-13d71612d2c0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" Apr 16 16:05:56.113110 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:56.113074 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/4973f86a-c513-4ca6-a939-13d71612d2c0-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-sfgcn\" (UID: \"4973f86a-c513-4ca6-a939-13d71612d2c0\") " pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" Apr 16 16:05:56.323236 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:56.323174 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" Apr 16 16:05:56.446594 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:56.446563 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-sfgcn"] Apr 16 16:05:56.450924 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:05:56.450725 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4973f86a_c513_4ca6_a939_13d71612d2c0.slice/crio-a0f358774dd379f9c59faf3c838f97db8b3c556da800bd268c257683eab4ad84 WatchSource:0}: Error finding container a0f358774dd379f9c59faf3c838f97db8b3c556da800bd268c257683eab4ad84: Status 404 returned error can't find the container with id a0f358774dd379f9c59faf3c838f97db8b3c556da800bd268c257683eab4ad84 Apr 16 16:05:56.765633 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:56.765598 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" event={"ID":"4973f86a-c513-4ca6-a939-13d71612d2c0","Type":"ContainerStarted","Data":"a0f358774dd379f9c59faf3c838f97db8b3c556da800bd268c257683eab4ad84"} Apr 16 16:05:57.495771 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:57.495725 2576 prober.go:120] "Probe 
failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n" podUID="9bdf21f7-d4f3-48cc-a53f-85d7088427df" containerName="acm-agent" probeResult="failure" output="Get \"http://10.133.0.8:8000/readyz\": dial tcp 10.133.0.8:8000: connect: connection refused" Apr 16 16:05:57.744790 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:57.744760 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vzpf5" Apr 16 16:05:57.768658 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:57.768635 2576 generic.go:358] "Generic (PLEG): container finished" podID="9bdf21f7-d4f3-48cc-a53f-85d7088427df" containerID="2119d6503a3a30b2114256f7e738ca3f2471bac9eb81c8ff6069edb3e8f8eef1" exitCode=1 Apr 16 16:05:57.768773 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:57.768681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n" event={"ID":"9bdf21f7-d4f3-48cc-a53f-85d7088427df","Type":"ContainerDied","Data":"2119d6503a3a30b2114256f7e738ca3f2471bac9eb81c8ff6069edb3e8f8eef1"} Apr 16 16:05:57.768976 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:57.768963 2576 scope.go:117] "RemoveContainer" containerID="2119d6503a3a30b2114256f7e738ca3f2471bac9eb81c8ff6069edb3e8f8eef1" Apr 16 16:05:58.772501 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:58.772464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n" event={"ID":"9bdf21f7-d4f3-48cc-a53f-85d7088427df","Type":"ContainerStarted","Data":"ebe5ff19aa9f4769dcc9f95d8c02e80b2e9d93c5520a70002099e01ce56a2b6a"} Apr 16 16:05:58.772974 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:58.772769 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n" Apr 16 16:05:58.773421 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:58.773402 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6f87575f49-9hw9n" Apr 16 16:05:58.774103 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:58.774084 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" event={"ID":"4973f86a-c513-4ca6-a939-13d71612d2c0","Type":"ContainerStarted","Data":"63dee3f375ee40fb73281f7a73b649c61d311859eef90f484ed670fbbf853270"} Apr 16 16:05:58.774103 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:58.774110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" event={"ID":"4973f86a-c513-4ca6-a939-13d71612d2c0","Type":"ContainerStarted","Data":"d6ce57cac038dd3b4657ad074df07f4e35914cbfee503d3205861a63719932bf"} Apr 16 16:05:58.806509 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:05:58.806460 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-sfgcn" podStartSLOduration=2.544986065 podStartE2EDuration="3.806446669s" podCreationTimestamp="2026-04-16 16:05:55 +0000 UTC" firstStartedPulling="2026-04-16 16:05:56.453148892 +0000 UTC m=+179.808139002" lastFinishedPulling="2026-04-16 16:05:57.7146095 +0000 UTC m=+181.069599606" observedRunningTime="2026-04-16 16:05:58.805812385 +0000 UTC m=+182.160802504" watchObservedRunningTime="2026-04-16 16:05:58.806446669 +0000 UTC m=+182.161436798" Apr 16 16:06:00.860560 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:00.860528 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zdvmv"] Apr 16 16:06:00.863839 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:00.863818 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:00.866486 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:00.866464 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 16:06:00.866600 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:00.866489 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-nrtjc\"" Apr 16 16:06:00.866600 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:00.866489 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 16:06:00.866752 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:00.866737 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 16:06:00.949350 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:00.949314 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-accelerators-collector-config\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:00.949350 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:00.949350 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee37060b-37d5-4004-91af-f37493123dc3-sys\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:00.949604 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:00.949377 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ee37060b-37d5-4004-91af-f37493123dc3-metrics-client-ca\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:00.949604 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:00.949400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-tls\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:00.949604 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:00.949456 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-textfile\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:00.949604 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:00.949505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:00.949604 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:00.949575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ee37060b-37d5-4004-91af-f37493123dc3-root\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " 
pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:00.949814 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:00.949610 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52cv2\" (UniqueName: \"kubernetes.io/projected/ee37060b-37d5-4004-91af-f37493123dc3-kube-api-access-52cv2\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:00.949814 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:00.949657 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-wtmp\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:01.050319 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.050266 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-wtmp\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:01.050319 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.050325 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-accelerators-collector-config\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:01.050565 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.050457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-wtmp\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:01.050565 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.050452 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee37060b-37d5-4004-91af-f37493123dc3-sys\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:01.050565 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.050495 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ee37060b-37d5-4004-91af-f37493123dc3-sys\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:01.050565 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.050527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ee37060b-37d5-4004-91af-f37493123dc3-metrics-client-ca\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:01.050565 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.050550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-tls\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:01.050802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.050581 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-textfile\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:01.050802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.050613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:01.050802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.050682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ee37060b-37d5-4004-91af-f37493123dc3-root\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:01.050802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.050706 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52cv2\" (UniqueName: \"kubernetes.io/projected/ee37060b-37d5-4004-91af-f37493123dc3-kube-api-access-52cv2\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv" Apr 16 16:06:01.050802 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:06:01.050693 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 16:06:01.050802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.050751 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ee37060b-37d5-4004-91af-f37493123dc3-root\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " 
pod="openshift-monitoring/node-exporter-zdvmv"
Apr 16 16:06:01.050802 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:06:01.050770 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-tls podName:ee37060b-37d5-4004-91af-f37493123dc3 nodeName:}" failed. No retries permitted until 2026-04-16 16:06:01.550752123 +0000 UTC m=+184.905742230 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-tls") pod "node-exporter-zdvmv" (UID: "ee37060b-37d5-4004-91af-f37493123dc3") : secret "node-exporter-tls" not found
Apr 16 16:06:01.051145 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.050905 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-accelerators-collector-config\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv"
Apr 16 16:06:01.051145 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.050910 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-textfile\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv"
Apr 16 16:06:01.051145 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.051098 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ee37060b-37d5-4004-91af-f37493123dc3-metrics-client-ca\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv"
Apr 16 16:06:01.052937 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.052922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv"
Apr 16 16:06:01.062457 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.062429 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52cv2\" (UniqueName: \"kubernetes.io/projected/ee37060b-37d5-4004-91af-f37493123dc3-kube-api-access-52cv2\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv"
Apr 16 16:06:01.554619 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.554583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-tls\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv"
Apr 16 16:06:01.556978 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.556953 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ee37060b-37d5-4004-91af-f37493123dc3-node-exporter-tls\") pod \"node-exporter-zdvmv\" (UID: \"ee37060b-37d5-4004-91af-f37493123dc3\") " pod="openshift-monitoring/node-exporter-zdvmv"
Apr 16 16:06:01.772450 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:01.772416 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-zdvmv"
Apr 16 16:06:01.781377 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:06:01.781345 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee37060b_37d5_4004_91af_f37493123dc3.slice/crio-4a9ee3d192089975a256c41c8ae1464716ea73ccdde902d72e86c7904a989e20 WatchSource:0}: Error finding container 4a9ee3d192089975a256c41c8ae1464716ea73ccdde902d72e86c7904a989e20: Status 404 returned error can't find the container with id 4a9ee3d192089975a256c41c8ae1464716ea73ccdde902d72e86c7904a989e20
Apr 16 16:06:02.785860 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:02.785781 2576 generic.go:358] "Generic (PLEG): container finished" podID="ee37060b-37d5-4004-91af-f37493123dc3" containerID="f628319677774f4da318d4b4672fd93a3221c8b40a1069c1bee9c8357b5f79ef" exitCode=0
Apr 16 16:06:02.785860 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:02.785844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zdvmv" event={"ID":"ee37060b-37d5-4004-91af-f37493123dc3","Type":"ContainerDied","Data":"f628319677774f4da318d4b4672fd93a3221c8b40a1069c1bee9c8357b5f79ef"}
Apr 16 16:06:02.786247 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:02.785876 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zdvmv" event={"ID":"ee37060b-37d5-4004-91af-f37493123dc3","Type":"ContainerStarted","Data":"4a9ee3d192089975a256c41c8ae1464716ea73ccdde902d72e86c7904a989e20"}
Apr 16 16:06:03.790140 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:03.790108 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zdvmv" event={"ID":"ee37060b-37d5-4004-91af-f37493123dc3","Type":"ContainerStarted","Data":"7dea69fbf04046ec67b5d68c4a7f75f43adf75d51995df2e0ed31a9d3803ad5d"}
Apr 16 16:06:03.790140 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:03.790145 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zdvmv" event={"ID":"ee37060b-37d5-4004-91af-f37493123dc3","Type":"ContainerStarted","Data":"8457cd6d967a2207f6c45226c953ae320539f433fb565c81da4b646f638c681b"}
Apr 16 16:06:03.813266 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:03.813203 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zdvmv" podStartSLOduration=3.142807861 podStartE2EDuration="3.813189655s" podCreationTimestamp="2026-04-16 16:06:00 +0000 UTC" firstStartedPulling="2026-04-16 16:06:01.783027279 +0000 UTC m=+185.138017386" lastFinishedPulling="2026-04-16 16:06:02.453409061 +0000 UTC m=+185.808399180" observedRunningTime="2026-04-16 16:06:03.812319951 +0000 UTC m=+187.167310081" watchObservedRunningTime="2026-04-16 16:06:03.813189655 +0000 UTC m=+187.168179784"
Apr 16 16:06:04.027381 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:04.027344 2576 patch_prober.go:28] interesting pod/image-registry-754d7b9497-rdfwj container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 16:06:04.027516 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:04.027400 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" podUID="fbd6bf59-f967-4545-bd0f-06cc259b395c" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:06:04.401200 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:04.401167 2576 patch_prober.go:28] interesting pod/image-registry-545c64844d-f4pv2 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 16:06:04.401379 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:04.401234 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-545c64844d-f4pv2" podUID="fe7bede0-0e38-4e70-89e7-de62eb29aaa4" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:06:05.617492 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:05.617451 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-kwwwq"]
Apr 16 16:06:05.620695 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:05.620676 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwwwq"
Apr 16 16:06:05.626826 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:05.626530 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 16:06:05.626826 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:05.626582 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-5pkf7\""
Apr 16 16:06:05.647054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:05.647032 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-kwwwq"]
Apr 16 16:06:05.689661 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:05.689635 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/16cac812-a0a2-4bf1-aa45-6ec4924939b1-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-kwwwq\" (UID: \"16cac812-a0a2-4bf1-aa45-6ec4924939b1\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwwwq"
Apr 16 16:06:05.735115 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:05.735091 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-545c64844d-f4pv2"
Apr 16 16:06:05.790373 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:05.790341 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/16cac812-a0a2-4bf1-aa45-6ec4924939b1-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-kwwwq\" (UID: \"16cac812-a0a2-4bf1-aa45-6ec4924939b1\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwwwq"
Apr 16 16:06:05.793241 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:05.793196 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/16cac812-a0a2-4bf1-aa45-6ec4924939b1-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-kwwwq\" (UID: \"16cac812-a0a2-4bf1-aa45-6ec4924939b1\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwwwq"
Apr 16 16:06:05.929182 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:05.929153 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwwwq"
Apr 16 16:06:06.066405 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:06.066372 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-kwwwq"]
Apr 16 16:06:06.070902 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:06:06.070877 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16cac812_a0a2_4bf1_aa45_6ec4924939b1.slice/crio-9bb95c965e92b2d0e242f79cecbf22df8ab0ca37848219d5d9dc0f079761e109 WatchSource:0}: Error finding container 9bb95c965e92b2d0e242f79cecbf22df8ab0ca37848219d5d9dc0f079761e109: Status 404 returned error can't find the container with id 9bb95c965e92b2d0e242f79cecbf22df8ab0ca37848219d5d9dc0f079761e109
Apr 16 16:06:06.798579 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:06.798547 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwwwq" event={"ID":"16cac812-a0a2-4bf1-aa45-6ec4924939b1","Type":"ContainerStarted","Data":"9bb95c965e92b2d0e242f79cecbf22df8ab0ca37848219d5d9dc0f079761e109"}
Apr 16 16:06:07.802792 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:07.802755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwwwq" event={"ID":"16cac812-a0a2-4bf1-aa45-6ec4924939b1","Type":"ContainerStarted","Data":"6bc665f3f29df0913f60d763d92d783c6ad4c1afe0b70eb57628de8b68a146e2"}
Apr 16 16:06:07.803269 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:07.802978 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwwwq"
Apr 16 16:06:07.807872 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:07.807810 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwwwq"
Apr 16 16:06:07.822760 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:07.822717 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-kwwwq" podStartSLOduration=1.339781904 podStartE2EDuration="2.822704271s" podCreationTimestamp="2026-04-16 16:06:05 +0000 UTC" firstStartedPulling="2026-04-16 16:06:06.073159783 +0000 UTC m=+189.428149890" lastFinishedPulling="2026-04-16 16:06:07.556081952 +0000 UTC m=+190.911072257" observedRunningTime="2026-04-16 16:06:07.820497415 +0000 UTC m=+191.175487543" watchObservedRunningTime="2026-04-16 16:06:07.822704271 +0000 UTC m=+191.177694458"
Apr 16 16:06:09.043631 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.043578 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" podUID="fbd6bf59-f967-4545-bd0f-06cc259b395c" containerName="registry" containerID="cri-o://267e072242b1ba4c657e87c1f40f14deea4b6e99dcb0744bf526eb50c3c6c03b" gracePeriod=30
Apr 16 16:06:09.272542 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.272513 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-754d7b9497-rdfwj"
Apr 16 16:06:09.425447 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.425417 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fbd6bf59-f967-4545-bd0f-06cc259b395c-installation-pull-secrets\") pod \"fbd6bf59-f967-4545-bd0f-06cc259b395c\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") "
Apr 16 16:06:09.425594 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.425472 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fbd6bf59-f967-4545-bd0f-06cc259b395c-image-registry-private-configuration\") pod \"fbd6bf59-f967-4545-bd0f-06cc259b395c\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") "
Apr 16 16:06:09.425594 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.425499 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fbd6bf59-f967-4545-bd0f-06cc259b395c-ca-trust-extracted\") pod \"fbd6bf59-f967-4545-bd0f-06cc259b395c\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") "
Apr 16 16:06:09.425737 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.425669 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-certificates\") pod \"fbd6bf59-f967-4545-bd0f-06cc259b395c\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") "
Apr 16 16:06:09.425737 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.425720 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlz65\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-kube-api-access-hlz65\") pod \"fbd6bf59-f967-4545-bd0f-06cc259b395c\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") "
Apr 16 16:06:09.425816 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.425750 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls\") pod \"fbd6bf59-f967-4545-bd0f-06cc259b395c\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") "
Apr 16 16:06:09.425816 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.425776 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbd6bf59-f967-4545-bd0f-06cc259b395c-trusted-ca\") pod \"fbd6bf59-f967-4545-bd0f-06cc259b395c\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") "
Apr 16 16:06:09.425816 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.425811 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-bound-sa-token\") pod \"fbd6bf59-f967-4545-bd0f-06cc259b395c\" (UID: \"fbd6bf59-f967-4545-bd0f-06cc259b395c\") "
Apr 16 16:06:09.426244 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.426144 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "fbd6bf59-f967-4545-bd0f-06cc259b395c" (UID: "fbd6bf59-f967-4545-bd0f-06cc259b395c"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:06:09.426355 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.426246 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fbd6bf59-f967-4545-bd0f-06cc259b395c-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "fbd6bf59-f967-4545-bd0f-06cc259b395c" (UID: "fbd6bf59-f967-4545-bd0f-06cc259b395c"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:06:09.428491 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.428456 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "fbd6bf59-f967-4545-bd0f-06cc259b395c" (UID: "fbd6bf59-f967-4545-bd0f-06cc259b395c"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:06:09.428610 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.428500 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbd6bf59-f967-4545-bd0f-06cc259b395c-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "fbd6bf59-f967-4545-bd0f-06cc259b395c" (UID: "fbd6bf59-f967-4545-bd0f-06cc259b395c"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:06:09.428610 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.428519 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbd6bf59-f967-4545-bd0f-06cc259b395c-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "fbd6bf59-f967-4545-bd0f-06cc259b395c" (UID: "fbd6bf59-f967-4545-bd0f-06cc259b395c"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:06:09.428610 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.428500 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-kube-api-access-hlz65" (OuterVolumeSpecName: "kube-api-access-hlz65") pod "fbd6bf59-f967-4545-bd0f-06cc259b395c" (UID: "fbd6bf59-f967-4545-bd0f-06cc259b395c"). InnerVolumeSpecName "kube-api-access-hlz65". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:06:09.428610 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.428553 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "fbd6bf59-f967-4545-bd0f-06cc259b395c" (UID: "fbd6bf59-f967-4545-bd0f-06cc259b395c"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:06:09.434781 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.434755 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbd6bf59-f967-4545-bd0f-06cc259b395c-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "fbd6bf59-f967-4545-bd0f-06cc259b395c" (UID: "fbd6bf59-f967-4545-bd0f-06cc259b395c"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:06:09.526423 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.526387 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-bound-sa-token\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:06:09.526423 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.526418 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fbd6bf59-f967-4545-bd0f-06cc259b395c-installation-pull-secrets\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:06:09.526423 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.526428 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/fbd6bf59-f967-4545-bd0f-06cc259b395c-image-registry-private-configuration\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:06:09.526701 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.526439 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fbd6bf59-f967-4545-bd0f-06cc259b395c-ca-trust-extracted\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:06:09.526701 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.526448 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-certificates\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:06:09.526701 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.526457 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hlz65\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-kube-api-access-hlz65\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:06:09.526701 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.526483 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fbd6bf59-f967-4545-bd0f-06cc259b395c-registry-tls\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:06:09.526701 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.526492 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbd6bf59-f967-4545-bd0f-06cc259b395c-trusted-ca\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:06:09.808277 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.808176 2576 generic.go:358] "Generic (PLEG): container finished" podID="fbd6bf59-f967-4545-bd0f-06cc259b395c" containerID="267e072242b1ba4c657e87c1f40f14deea4b6e99dcb0744bf526eb50c3c6c03b" exitCode=0
Apr 16 16:06:09.808277 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.808244 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" event={"ID":"fbd6bf59-f967-4545-bd0f-06cc259b395c","Type":"ContainerDied","Data":"267e072242b1ba4c657e87c1f40f14deea4b6e99dcb0744bf526eb50c3c6c03b"}
Apr 16 16:06:09.808277 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.808266 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-754d7b9497-rdfwj"
Apr 16 16:06:09.808505 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.808281 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-754d7b9497-rdfwj" event={"ID":"fbd6bf59-f967-4545-bd0f-06cc259b395c","Type":"ContainerDied","Data":"9795094f0e5c3bf222519d5710beb124141402568ab0cc460dcbd78458b68c21"}
Apr 16 16:06:09.808505 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.808296 2576 scope.go:117] "RemoveContainer" containerID="267e072242b1ba4c657e87c1f40f14deea4b6e99dcb0744bf526eb50c3c6c03b"
Apr 16 16:06:09.816409 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.816394 2576 scope.go:117] "RemoveContainer" containerID="267e072242b1ba4c657e87c1f40f14deea4b6e99dcb0744bf526eb50c3c6c03b"
Apr 16 16:06:09.816699 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:06:09.816673 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"267e072242b1ba4c657e87c1f40f14deea4b6e99dcb0744bf526eb50c3c6c03b\": container with ID starting with 267e072242b1ba4c657e87c1f40f14deea4b6e99dcb0744bf526eb50c3c6c03b not found: ID does not exist" containerID="267e072242b1ba4c657e87c1f40f14deea4b6e99dcb0744bf526eb50c3c6c03b"
Apr 16 16:06:09.816758 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.816710 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"267e072242b1ba4c657e87c1f40f14deea4b6e99dcb0744bf526eb50c3c6c03b"} err="failed to get container status \"267e072242b1ba4c657e87c1f40f14deea4b6e99dcb0744bf526eb50c3c6c03b\": rpc error: code = NotFound desc = could not find container \"267e072242b1ba4c657e87c1f40f14deea4b6e99dcb0744bf526eb50c3c6c03b\": container with ID starting with 267e072242b1ba4c657e87c1f40f14deea4b6e99dcb0744bf526eb50c3c6c03b not found: ID does not exist"
Apr 16 16:06:09.834980 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.834946 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-754d7b9497-rdfwj"]
Apr 16 16:06:09.836858 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:09.836835 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-754d7b9497-rdfwj"]
Apr 16 16:06:11.278169 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:11.278137 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbd6bf59-f967-4545-bd0f-06cc259b395c" path="/var/lib/kubelet/pods/fbd6bf59-f967-4545-bd0f-06cc259b395c/volumes"
Apr 16 16:06:15.077501 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:15.077464 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-knjdk"]
Apr 16 16:06:15.077945 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:15.077726 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fbd6bf59-f967-4545-bd0f-06cc259b395c" containerName="registry"
Apr 16 16:06:15.077945 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:15.077739 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbd6bf59-f967-4545-bd0f-06cc259b395c" containerName="registry"
Apr 16 16:06:15.077945 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:15.077777 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fbd6bf59-f967-4545-bd0f-06cc259b395c" containerName="registry"
Apr 16 16:06:15.080503 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:15.080483 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-knjdk"
Apr 16 16:06:15.083738 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:15.083718 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 16:06:15.083867 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:15.083811 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-xtjvr\""
Apr 16 16:06:15.083927 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:15.083859 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 16:06:15.090773 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:15.090745 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-knjdk"]
Apr 16 16:06:15.168986 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:15.168946 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5cbh\" (UniqueName: \"kubernetes.io/projected/4270a553-9c4d-44df-8a3d-2ec2e74d38b1-kube-api-access-j5cbh\") pod \"downloads-586b57c7b4-knjdk\" (UID: \"4270a553-9c4d-44df-8a3d-2ec2e74d38b1\") " pod="openshift-console/downloads-586b57c7b4-knjdk"
Apr 16 16:06:15.270281 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:15.270244 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5cbh\" (UniqueName: \"kubernetes.io/projected/4270a553-9c4d-44df-8a3d-2ec2e74d38b1-kube-api-access-j5cbh\") pod \"downloads-586b57c7b4-knjdk\" (UID: \"4270a553-9c4d-44df-8a3d-2ec2e74d38b1\") " pod="openshift-console/downloads-586b57c7b4-knjdk"
Apr 16 16:06:15.280401 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:15.280371 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5cbh\" (UniqueName: \"kubernetes.io/projected/4270a553-9c4d-44df-8a3d-2ec2e74d38b1-kube-api-access-j5cbh\") pod \"downloads-586b57c7b4-knjdk\" (UID: \"4270a553-9c4d-44df-8a3d-2ec2e74d38b1\") " pod="openshift-console/downloads-586b57c7b4-knjdk"
Apr 16 16:06:15.393537 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:15.393448 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-knjdk"
Apr 16 16:06:15.513504 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:15.513477 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-knjdk"]
Apr 16 16:06:15.516875 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:06:15.516848 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4270a553_9c4d_44df_8a3d_2ec2e74d38b1.slice/crio-67f9fe5e7d585c19785af2005b03f96b9d82441ce651da709c03b0f66217ddfa WatchSource:0}: Error finding container 67f9fe5e7d585c19785af2005b03f96b9d82441ce651da709c03b0f66217ddfa: Status 404 returned error can't find the container with id 67f9fe5e7d585c19785af2005b03f96b9d82441ce651da709c03b0f66217ddfa
Apr 16 16:06:15.825237 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:15.825190 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-knjdk" event={"ID":"4270a553-9c4d-44df-8a3d-2ec2e74d38b1","Type":"ContainerStarted","Data":"67f9fe5e7d585c19785af2005b03f96b9d82441ce651da709c03b0f66217ddfa"}
Apr 16 16:06:24.929557 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:24.929523 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76c7575ff8-9zxt9"]
Apr 16 16:06:24.933084 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:24.933064 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:24.937976 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:24.937950 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 16:06:24.938121 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:24.937952 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 16:06:24.938324 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:24.938276 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 16:06:24.938901 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:24.938857 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 16:06:24.938901 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:24.938898 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jwjgm\""
Apr 16 16:06:24.939063 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:24.939007 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 16:06:24.956854 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:24.956824 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c7575ff8-9zxt9"]
Apr 16 16:06:25.057496 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.057458 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-service-ca\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:25.057496 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.057497 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ldx7\" (UniqueName: \"kubernetes.io/projected/8f222af2-9e41-44d5-8b00-aa5477fb6187-kube-api-access-6ldx7\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:25.057743 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.057580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-config\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:25.057743 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.057609 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-oauth-config\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:25.057743 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.057637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-serving-cert\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:25.057743 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.057675 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-oauth-serving-cert\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:25.158924 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.158891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-service-ca\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:25.159096 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.158936 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ldx7\" (UniqueName: \"kubernetes.io/projected/8f222af2-9e41-44d5-8b00-aa5477fb6187-kube-api-access-6ldx7\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:25.159096 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.158996 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-config\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:25.159096 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.159018 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-oauth-config\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:25.159096 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.159035 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-serving-cert\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:25.159096 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.159087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-oauth-serving-cert\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:25.159745 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.159711 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-service-ca\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:25.159927 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.159857 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-config\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:25.160276 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.160251 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-oauth-serving-cert\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9"
Apr 16 16:06:25.161978 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.161956 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-serving-cert\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9" Apr 16 16:06:25.162088 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.161976 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-oauth-config\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9" Apr 16 16:06:25.177974 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.177950 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ldx7\" (UniqueName: \"kubernetes.io/projected/8f222af2-9e41-44d5-8b00-aa5477fb6187-kube-api-access-6ldx7\") pod \"console-76c7575ff8-9zxt9\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " pod="openshift-console/console-76c7575ff8-9zxt9" Apr 16 16:06:25.244459 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:25.244373 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c7575ff8-9zxt9" Apr 16 16:06:29.359866 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.359823 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-bd7b8f78c-488dr"] Apr 16 16:06:29.366530 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.366501 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.374974 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.374944 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 16:06:29.375258 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.375233 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bd7b8f78c-488dr"] Apr 16 16:06:29.391727 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.391695 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj6xf\" (UniqueName: \"kubernetes.io/projected/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-kube-api-access-vj6xf\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.391901 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.391750 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-trusted-ca-bundle\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.391901 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.391786 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-serving-cert\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.391901 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.391816 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-oauth-config\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.392065 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.391899 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-service-ca\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.392065 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.391938 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-oauth-serving-cert\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.392065 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.391970 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-config\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.492498 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.492455 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-config\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.492705 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.492548 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vj6xf\" (UniqueName: \"kubernetes.io/projected/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-kube-api-access-vj6xf\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.492705 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.492592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-trusted-ca-bundle\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.492705 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.492624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-serving-cert\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.492705 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.492646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-oauth-config\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.492933 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.492787 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-service-ca\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.493123 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.493053 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-oauth-serving-cert\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.493355 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.493304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-config\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.493597 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.493573 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-service-ca\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.493817 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.493770 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-oauth-serving-cert\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.493930 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.493871 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-trusted-ca-bundle\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 
16:06:29.495459 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.495431 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-serving-cert\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.495551 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.495534 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-oauth-config\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.509138 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.509109 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj6xf\" (UniqueName: \"kubernetes.io/projected/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-kube-api-access-vj6xf\") pod \"console-bd7b8f78c-488dr\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:29.678700 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:29.678660 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:30.988930 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:30.988905 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bd7b8f78c-488dr"] Apr 16 16:06:31.003279 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:06:31.003249 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45052aca_38b1_43e2_8dbb_e08cd9a6ade2.slice/crio-d904a681791b5f5e9549eed143626b1f24d47fc039596390f949348318fd3ec1 WatchSource:0}: Error finding container d904a681791b5f5e9549eed143626b1f24d47fc039596390f949348318fd3ec1: Status 404 returned error can't find the container with id d904a681791b5f5e9549eed143626b1f24d47fc039596390f949348318fd3ec1 Apr 16 16:06:31.022981 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:31.022952 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c7575ff8-9zxt9"] Apr 16 16:06:31.025309 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:06:31.025282 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f222af2_9e41_44d5_8b00_aa5477fb6187.slice/crio-aeab175cac1806098e4194a3871b4e72d2210832e88b673da6d6e8636cdc92f1 WatchSource:0}: Error finding container aeab175cac1806098e4194a3871b4e72d2210832e88b673da6d6e8636cdc92f1: Status 404 returned error can't find the container with id aeab175cac1806098e4194a3871b4e72d2210832e88b673da6d6e8636cdc92f1 Apr 16 16:06:31.874232 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:31.873922 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-knjdk" event={"ID":"4270a553-9c4d-44df-8a3d-2ec2e74d38b1","Type":"ContainerStarted","Data":"29a10b1bed3cd9c0852b5deacf5fecc7519e7620e62050e4136b135af2190214"} Apr 16 16:06:31.874232 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:31.873989 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-knjdk" Apr 16 16:06:31.876789 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:31.876757 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c7575ff8-9zxt9" event={"ID":"8f222af2-9e41-44d5-8b00-aa5477fb6187","Type":"ContainerStarted","Data":"aeab175cac1806098e4194a3871b4e72d2210832e88b673da6d6e8636cdc92f1"} Apr 16 16:06:31.878326 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:31.878289 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bd7b8f78c-488dr" event={"ID":"45052aca-38b1-43e2-8dbb-e08cd9a6ade2","Type":"ContainerStarted","Data":"d904a681791b5f5e9549eed143626b1f24d47fc039596390f949348318fd3ec1"} Apr 16 16:06:31.887593 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:31.887560 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-knjdk" Apr 16 16:06:31.893797 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:31.892628 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-knjdk" podStartSLOduration=1.451265622 podStartE2EDuration="16.892611782s" podCreationTimestamp="2026-04-16 16:06:15 +0000 UTC" firstStartedPulling="2026-04-16 16:06:15.519192905 +0000 UTC m=+198.874183015" lastFinishedPulling="2026-04-16 16:06:30.960539054 +0000 UTC m=+214.315529175" observedRunningTime="2026-04-16 16:06:31.892469657 +0000 UTC m=+215.247459787" watchObservedRunningTime="2026-04-16 16:06:31.892611782 +0000 UTC m=+215.247601911" Apr 16 16:06:35.891500 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:35.891456 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c7575ff8-9zxt9" event={"ID":"8f222af2-9e41-44d5-8b00-aa5477fb6187","Type":"ContainerStarted","Data":"edb81c7e38ddc138df9cc89b90439ba552fc23d1e26396f97be78e460320834f"} Apr 16 
16:06:35.893083 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:35.893057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bd7b8f78c-488dr" event={"ID":"45052aca-38b1-43e2-8dbb-e08cd9a6ade2","Type":"ContainerStarted","Data":"cd85b8ca6e5914bdbe480a271f46b34842d3134f7c2fde45f803b4e170a072e5"} Apr 16 16:06:35.915492 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:35.915442 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76c7575ff8-9zxt9" podStartSLOduration=8.0302978 podStartE2EDuration="11.915427337s" podCreationTimestamp="2026-04-16 16:06:24 +0000 UTC" firstStartedPulling="2026-04-16 16:06:31.027199249 +0000 UTC m=+214.382189371" lastFinishedPulling="2026-04-16 16:06:34.912328798 +0000 UTC m=+218.267318908" observedRunningTime="2026-04-16 16:06:35.914517633 +0000 UTC m=+219.269507760" watchObservedRunningTime="2026-04-16 16:06:35.915427337 +0000 UTC m=+219.270417466" Apr 16 16:06:35.935357 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:35.935310 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bd7b8f78c-488dr" podStartSLOduration=3.021159879 podStartE2EDuration="6.935292943s" podCreationTimestamp="2026-04-16 16:06:29 +0000 UTC" firstStartedPulling="2026-04-16 16:06:31.005322485 +0000 UTC m=+214.360312592" lastFinishedPulling="2026-04-16 16:06:34.919455544 +0000 UTC m=+218.274445656" observedRunningTime="2026-04-16 16:06:35.934900253 +0000 UTC m=+219.289890378" watchObservedRunningTime="2026-04-16 16:06:35.935292943 +0000 UTC m=+219.290283072" Apr 16 16:06:39.679727 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:39.679679 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:39.679727 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:39.679736 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:39.685304 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:39.685276 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:39.910448 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:39.910416 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:06:39.998471 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:39.998390 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c7575ff8-9zxt9"] Apr 16 16:06:45.244694 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:45.244656 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-76c7575ff8-9zxt9" Apr 16 16:06:56.955439 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:56.955402 2576 generic.go:358] "Generic (PLEG): container finished" podID="ef511675-c77f-44dc-a5c2-e3ec14d14609" containerID="29f52ff9d4bc6414f17322783831bc93f901944c88153ad1b83e4202da4dba0d" exitCode=0 Apr 16 16:06:56.955899 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:56.955463 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn" event={"ID":"ef511675-c77f-44dc-a5c2-e3ec14d14609","Type":"ContainerDied","Data":"29f52ff9d4bc6414f17322783831bc93f901944c88153ad1b83e4202da4dba0d"} Apr 16 16:06:56.955899 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:56.955833 2576 scope.go:117] "RemoveContainer" containerID="29f52ff9d4bc6414f17322783831bc93f901944c88153ad1b83e4202da4dba0d" Apr 16 16:06:57.959620 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:06:57.959587 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-57wjn" 
event={"ID":"ef511675-c77f-44dc-a5c2-e3ec14d14609","Type":"ContainerStarted","Data":"943123d13804226fd801e63f1f18e36725993dcfd54e19728e64f0641bc21dcf"} Apr 16 16:07:05.021859 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.021820 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76c7575ff8-9zxt9" podUID="8f222af2-9e41-44d5-8b00-aa5477fb6187" containerName="console" containerID="cri-o://edb81c7e38ddc138df9cc89b90439ba552fc23d1e26396f97be78e460320834f" gracePeriod=15 Apr 16 16:07:05.288011 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.287990 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c7575ff8-9zxt9_8f222af2-9e41-44d5-8b00-aa5477fb6187/console/0.log" Apr 16 16:07:05.288121 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.288046 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c7575ff8-9zxt9" Apr 16 16:07:05.393497 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.393466 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-service-ca\") pod \"8f222af2-9e41-44d5-8b00-aa5477fb6187\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " Apr 16 16:07:05.393497 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.393498 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-config\") pod \"8f222af2-9e41-44d5-8b00-aa5477fb6187\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " Apr 16 16:07:05.393694 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.393527 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-oauth-config\") pod \"8f222af2-9e41-44d5-8b00-aa5477fb6187\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " Apr 16 16:07:05.393694 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.393547 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-oauth-serving-cert\") pod \"8f222af2-9e41-44d5-8b00-aa5477fb6187\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " Apr 16 16:07:05.393694 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.393584 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-serving-cert\") pod \"8f222af2-9e41-44d5-8b00-aa5477fb6187\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " Apr 16 16:07:05.393694 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.393617 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ldx7\" (UniqueName: \"kubernetes.io/projected/8f222af2-9e41-44d5-8b00-aa5477fb6187-kube-api-access-6ldx7\") pod \"8f222af2-9e41-44d5-8b00-aa5477fb6187\" (UID: \"8f222af2-9e41-44d5-8b00-aa5477fb6187\") " Apr 16 16:07:05.393987 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.393958 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8f222af2-9e41-44d5-8b00-aa5477fb6187" (UID: "8f222af2-9e41-44d5-8b00-aa5477fb6187"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:07:05.394122 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.393954 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-config" (OuterVolumeSpecName: "console-config") pod "8f222af2-9e41-44d5-8b00-aa5477fb6187" (UID: "8f222af2-9e41-44d5-8b00-aa5477fb6187"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:07:05.394122 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.393966 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-service-ca" (OuterVolumeSpecName: "service-ca") pod "8f222af2-9e41-44d5-8b00-aa5477fb6187" (UID: "8f222af2-9e41-44d5-8b00-aa5477fb6187"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:07:05.394264 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.394124 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-service-ca\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:07:05.394264 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.394145 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-config\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:07:05.394264 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.394159 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8f222af2-9e41-44d5-8b00-aa5477fb6187-oauth-serving-cert\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:07:05.395875 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.395853 2576 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8f222af2-9e41-44d5-8b00-aa5477fb6187" (UID: "8f222af2-9e41-44d5-8b00-aa5477fb6187"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:07:05.395971 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.395915 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8f222af2-9e41-44d5-8b00-aa5477fb6187" (UID: "8f222af2-9e41-44d5-8b00-aa5477fb6187"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:07:05.396047 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.396032 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f222af2-9e41-44d5-8b00-aa5477fb6187-kube-api-access-6ldx7" (OuterVolumeSpecName: "kube-api-access-6ldx7") pod "8f222af2-9e41-44d5-8b00-aa5477fb6187" (UID: "8f222af2-9e41-44d5-8b00-aa5477fb6187"). InnerVolumeSpecName "kube-api-access-6ldx7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:07:05.494510 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.494471 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-oauth-config\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:07:05.494510 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.494507 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8f222af2-9e41-44d5-8b00-aa5477fb6187-console-serving-cert\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:07:05.494510 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.494517 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6ldx7\" (UniqueName: \"kubernetes.io/projected/8f222af2-9e41-44d5-8b00-aa5477fb6187-kube-api-access-6ldx7\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:07:05.984199 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.984170 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c7575ff8-9zxt9_8f222af2-9e41-44d5-8b00-aa5477fb6187/console/0.log" Apr 16 16:07:05.984376 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.984226 2576 generic.go:358] "Generic (PLEG): container finished" podID="8f222af2-9e41-44d5-8b00-aa5477fb6187" containerID="edb81c7e38ddc138df9cc89b90439ba552fc23d1e26396f97be78e460320834f" exitCode=2 Apr 16 16:07:05.984376 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.984254 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c7575ff8-9zxt9" event={"ID":"8f222af2-9e41-44d5-8b00-aa5477fb6187","Type":"ContainerDied","Data":"edb81c7e38ddc138df9cc89b90439ba552fc23d1e26396f97be78e460320834f"} Apr 16 16:07:05.984376 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.984277 2576 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-console/console-76c7575ff8-9zxt9" event={"ID":"8f222af2-9e41-44d5-8b00-aa5477fb6187","Type":"ContainerDied","Data":"aeab175cac1806098e4194a3871b4e72d2210832e88b673da6d6e8636cdc92f1"} Apr 16 16:07:05.984376 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.984288 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c7575ff8-9zxt9" Apr 16 16:07:05.984376 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.984291 2576 scope.go:117] "RemoveContainer" containerID="edb81c7e38ddc138df9cc89b90439ba552fc23d1e26396f97be78e460320834f" Apr 16 16:07:05.998429 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.998413 2576 scope.go:117] "RemoveContainer" containerID="edb81c7e38ddc138df9cc89b90439ba552fc23d1e26396f97be78e460320834f" Apr 16 16:07:05.998671 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:07:05.998653 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb81c7e38ddc138df9cc89b90439ba552fc23d1e26396f97be78e460320834f\": container with ID starting with edb81c7e38ddc138df9cc89b90439ba552fc23d1e26396f97be78e460320834f not found: ID does not exist" containerID="edb81c7e38ddc138df9cc89b90439ba552fc23d1e26396f97be78e460320834f" Apr 16 16:07:05.998742 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:05.998679 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb81c7e38ddc138df9cc89b90439ba552fc23d1e26396f97be78e460320834f"} err="failed to get container status \"edb81c7e38ddc138df9cc89b90439ba552fc23d1e26396f97be78e460320834f\": rpc error: code = NotFound desc = could not find container \"edb81c7e38ddc138df9cc89b90439ba552fc23d1e26396f97be78e460320834f\": container with ID starting with edb81c7e38ddc138df9cc89b90439ba552fc23d1e26396f97be78e460320834f not found: ID does not exist" Apr 16 16:07:06.005299 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:06.005280 
2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c7575ff8-9zxt9"] Apr 16 16:07:06.008828 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:06.008809 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76c7575ff8-9zxt9"] Apr 16 16:07:07.277891 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:07.277860 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f222af2-9e41-44d5-8b00-aa5477fb6187" path="/var/lib/kubelet/pods/8f222af2-9e41-44d5-8b00-aa5477fb6187/volumes" Apr 16 16:07:09.023295 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:09.023256 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs\") pod \"network-metrics-daemon-gvndv\" (UID: \"916e5e50-1aef-4277-971a-7f2e8ffd2703\") " pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:07:09.025548 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:09.025529 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/916e5e50-1aef-4277-971a-7f2e8ffd2703-metrics-certs\") pod \"network-metrics-daemon-gvndv\" (UID: \"916e5e50-1aef-4277-971a-7f2e8ffd2703\") " pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:07:09.277432 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:09.277360 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hwq6n\"" Apr 16 16:07:09.285371 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:09.285349 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gvndv" Apr 16 16:07:09.411337 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:09.409784 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gvndv"] Apr 16 16:07:09.413300 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:07:09.413257 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod916e5e50_1aef_4277_971a_7f2e8ffd2703.slice/crio-03cacf411d944aefff8e3bd09c74e6d02552b9aa7595f98836c68f0db6264f69 WatchSource:0}: Error finding container 03cacf411d944aefff8e3bd09c74e6d02552b9aa7595f98836c68f0db6264f69: Status 404 returned error can't find the container with id 03cacf411d944aefff8e3bd09c74e6d02552b9aa7595f98836c68f0db6264f69 Apr 16 16:07:09.996888 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:09.996853 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gvndv" event={"ID":"916e5e50-1aef-4277-971a-7f2e8ffd2703","Type":"ContainerStarted","Data":"03cacf411d944aefff8e3bd09c74e6d02552b9aa7595f98836c68f0db6264f69"} Apr 16 16:07:11.000889 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:11.000860 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gvndv" event={"ID":"916e5e50-1aef-4277-971a-7f2e8ffd2703","Type":"ContainerStarted","Data":"62ba2ca0104197cf66ddcfb88e2b782de05e5794f20a4f7e247ad30a25827fd5"} Apr 16 16:07:12.004819 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:12.004786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gvndv" event={"ID":"916e5e50-1aef-4277-971a-7f2e8ffd2703","Type":"ContainerStarted","Data":"55ff537bf419100d17dbb0643ec64d282696f84625f76f45f21e3e3e2438bb06"} Apr 16 16:07:12.023941 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:12.023891 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-gvndv" podStartSLOduration=253.72295415 podStartE2EDuration="4m15.023874253s" podCreationTimestamp="2026-04-16 16:02:57 +0000 UTC" firstStartedPulling="2026-04-16 16:07:09.415328411 +0000 UTC m=+252.770318519" lastFinishedPulling="2026-04-16 16:07:10.7162485 +0000 UTC m=+254.071238622" observedRunningTime="2026-04-16 16:07:12.022017197 +0000 UTC m=+255.377007326" watchObservedRunningTime="2026-04-16 16:07:12.023874253 +0000 UTC m=+255.378864382" Apr 16 16:07:18.449580 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.449501 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5fd85f4cb6-r9bxs"] Apr 16 16:07:18.449911 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.449757 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8f222af2-9e41-44d5-8b00-aa5477fb6187" containerName="console" Apr 16 16:07:18.449911 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.449768 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f222af2-9e41-44d5-8b00-aa5477fb6187" containerName="console" Apr 16 16:07:18.449911 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.449825 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8f222af2-9e41-44d5-8b00-aa5477fb6187" containerName="console" Apr 16 16:07:18.452579 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.452558 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.464531 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.464507 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fd85f4cb6-r9bxs"] Apr 16 16:07:18.595878 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.595824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spbn8\" (UniqueName: \"kubernetes.io/projected/74c7ba9b-eb07-4da1-879f-8f4413d87608-kube-api-access-spbn8\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.595878 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.595881 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-oauth-serving-cert\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.596169 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.595940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-service-ca\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.596169 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.595995 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-config\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 
16:07:18.596169 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.596058 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-trusted-ca-bundle\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.596169 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.596090 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-serving-cert\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.596169 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.596119 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-oauth-config\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.697418 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.697385 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spbn8\" (UniqueName: \"kubernetes.io/projected/74c7ba9b-eb07-4da1-879f-8f4413d87608-kube-api-access-spbn8\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.697418 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.697424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-oauth-serving-cert\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.697671 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.697449 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-service-ca\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.697671 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.697578 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-config\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.697765 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.697672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-trusted-ca-bundle\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.697765 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.697707 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-serving-cert\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.697765 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.697733 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-oauth-config\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.698186 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.698149 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-service-ca\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.698295 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.698255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-oauth-serving-cert\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.698363 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.698345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-config\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.698515 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.698496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-trusted-ca-bundle\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.700109 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.700045 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-oauth-config\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.700242 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.700200 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-serving-cert\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.710687 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.710665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spbn8\" (UniqueName: \"kubernetes.io/projected/74c7ba9b-eb07-4da1-879f-8f4413d87608-kube-api-access-spbn8\") pod \"console-5fd85f4cb6-r9bxs\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.761509 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.761470 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:18.875122 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:18.875091 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5fd85f4cb6-r9bxs"] Apr 16 16:07:18.878088 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:07:18.878057 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74c7ba9b_eb07_4da1_879f_8f4413d87608.slice/crio-07aabae5aed6e8af1d1695d708fdeabfff968e57cec5090a44f91b281a547bd4 WatchSource:0}: Error finding container 07aabae5aed6e8af1d1695d708fdeabfff968e57cec5090a44f91b281a547bd4: Status 404 returned error can't find the container with id 07aabae5aed6e8af1d1695d708fdeabfff968e57cec5090a44f91b281a547bd4 Apr 16 16:07:19.027331 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:19.027245 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fd85f4cb6-r9bxs" event={"ID":"74c7ba9b-eb07-4da1-879f-8f4413d87608","Type":"ContainerStarted","Data":"6e3dce62dd0ebc455fe1894f9d3948ff06cffdf20690cac3bc123a006bbdeb8d"} Apr 16 16:07:19.027331 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:19.027281 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fd85f4cb6-r9bxs" event={"ID":"74c7ba9b-eb07-4da1-879f-8f4413d87608","Type":"ContainerStarted","Data":"07aabae5aed6e8af1d1695d708fdeabfff968e57cec5090a44f91b281a547bd4"} Apr 16 16:07:19.045736 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:19.045689 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5fd85f4cb6-r9bxs" podStartSLOduration=1.045671157 podStartE2EDuration="1.045671157s" podCreationTimestamp="2026-04-16 16:07:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:07:19.044862821 +0000 UTC 
m=+262.399852951" watchObservedRunningTime="2026-04-16 16:07:19.045671157 +0000 UTC m=+262.400661286" Apr 16 16:07:28.762081 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:28.762038 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:28.762081 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:28.762083 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:28.766592 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:28.766573 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:29.060417 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:29.060339 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:07:29.126832 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:29.126795 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bd7b8f78c-488dr"] Apr 16 16:07:54.146529 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.146465 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-bd7b8f78c-488dr" podUID="45052aca-38b1-43e2-8dbb-e08cd9a6ade2" containerName="console" containerID="cri-o://cd85b8ca6e5914bdbe480a271f46b34842d3134f7c2fde45f803b4e170a072e5" gracePeriod=15 Apr 16 16:07:54.377083 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.377063 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bd7b8f78c-488dr_45052aca-38b1-43e2-8dbb-e08cd9a6ade2/console/0.log" Apr 16 16:07:54.377196 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.377120 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:07:54.481604 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.481517 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-trusted-ca-bundle\") pod \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " Apr 16 16:07:54.481604 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.481566 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-oauth-serving-cert\") pod \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " Apr 16 16:07:54.481604 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.481594 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-serving-cert\") pod \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " Apr 16 16:07:54.481604 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.481615 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-config\") pod \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " Apr 16 16:07:54.481937 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.481666 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-oauth-config\") pod \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " Apr 16 16:07:54.481937 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.481695 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj6xf\" (UniqueName: \"kubernetes.io/projected/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-kube-api-access-vj6xf\") pod \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " Apr 16 16:07:54.481937 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.481714 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-service-ca\") pod \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\" (UID: \"45052aca-38b1-43e2-8dbb-e08cd9a6ade2\") " Apr 16 16:07:54.481937 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.481922 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "45052aca-38b1-43e2-8dbb-e08cd9a6ade2" (UID: "45052aca-38b1-43e2-8dbb-e08cd9a6ade2"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:07:54.482133 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.482075 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-config" (OuterVolumeSpecName: "console-config") pod "45052aca-38b1-43e2-8dbb-e08cd9a6ade2" (UID: "45052aca-38b1-43e2-8dbb-e08cd9a6ade2"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:07:54.482133 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.482107 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "45052aca-38b1-43e2-8dbb-e08cd9a6ade2" (UID: "45052aca-38b1-43e2-8dbb-e08cd9a6ade2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:07:54.482241 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.482162 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-service-ca" (OuterVolumeSpecName: "service-ca") pod "45052aca-38b1-43e2-8dbb-e08cd9a6ade2" (UID: "45052aca-38b1-43e2-8dbb-e08cd9a6ade2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:07:54.483916 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.483886 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "45052aca-38b1-43e2-8dbb-e08cd9a6ade2" (UID: "45052aca-38b1-43e2-8dbb-e08cd9a6ade2"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:07:54.483916 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.483906 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-kube-api-access-vj6xf" (OuterVolumeSpecName: "kube-api-access-vj6xf") pod "45052aca-38b1-43e2-8dbb-e08cd9a6ade2" (UID: "45052aca-38b1-43e2-8dbb-e08cd9a6ade2"). InnerVolumeSpecName "kube-api-access-vj6xf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:07:54.484052 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.483908 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "45052aca-38b1-43e2-8dbb-e08cd9a6ade2" (UID: "45052aca-38b1-43e2-8dbb-e08cd9a6ade2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:07:54.582459 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.582435 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-config\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:07:54.582459 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.582454 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-oauth-config\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:07:54.582587 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.582464 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vj6xf\" (UniqueName: \"kubernetes.io/projected/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-kube-api-access-vj6xf\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:07:54.582587 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.582473 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-service-ca\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:07:54.582587 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.582482 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-trusted-ca-bundle\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:07:54.582587 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.582492 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-oauth-serving-cert\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:07:54.582587 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:54.582500 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/45052aca-38b1-43e2-8dbb-e08cd9a6ade2-console-serving-cert\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:07:55.130871 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:55.130843 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bd7b8f78c-488dr_45052aca-38b1-43e2-8dbb-e08cd9a6ade2/console/0.log" Apr 16 16:07:55.131043 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:55.130889 2576 generic.go:358] "Generic (PLEG): container finished" podID="45052aca-38b1-43e2-8dbb-e08cd9a6ade2" containerID="cd85b8ca6e5914bdbe480a271f46b34842d3134f7c2fde45f803b4e170a072e5" exitCode=2 Apr 16 16:07:55.131043 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:55.130934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bd7b8f78c-488dr" event={"ID":"45052aca-38b1-43e2-8dbb-e08cd9a6ade2","Type":"ContainerDied","Data":"cd85b8ca6e5914bdbe480a271f46b34842d3134f7c2fde45f803b4e170a072e5"} Apr 16 16:07:55.131043 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:55.130962 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bd7b8f78c-488dr" event={"ID":"45052aca-38b1-43e2-8dbb-e08cd9a6ade2","Type":"ContainerDied","Data":"d904a681791b5f5e9549eed143626b1f24d47fc039596390f949348318fd3ec1"} Apr 16 16:07:55.131043 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:55.130964 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bd7b8f78c-488dr" Apr 16 16:07:55.131043 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:55.131037 2576 scope.go:117] "RemoveContainer" containerID="cd85b8ca6e5914bdbe480a271f46b34842d3134f7c2fde45f803b4e170a072e5" Apr 16 16:07:55.138803 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:55.138785 2576 scope.go:117] "RemoveContainer" containerID="cd85b8ca6e5914bdbe480a271f46b34842d3134f7c2fde45f803b4e170a072e5" Apr 16 16:07:55.139063 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:07:55.139046 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd85b8ca6e5914bdbe480a271f46b34842d3134f7c2fde45f803b4e170a072e5\": container with ID starting with cd85b8ca6e5914bdbe480a271f46b34842d3134f7c2fde45f803b4e170a072e5 not found: ID does not exist" containerID="cd85b8ca6e5914bdbe480a271f46b34842d3134f7c2fde45f803b4e170a072e5" Apr 16 16:07:55.139127 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:55.139074 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd85b8ca6e5914bdbe480a271f46b34842d3134f7c2fde45f803b4e170a072e5"} err="failed to get container status \"cd85b8ca6e5914bdbe480a271f46b34842d3134f7c2fde45f803b4e170a072e5\": rpc error: code = NotFound desc = could not find container \"cd85b8ca6e5914bdbe480a271f46b34842d3134f7c2fde45f803b4e170a072e5\": container with ID starting with cd85b8ca6e5914bdbe480a271f46b34842d3134f7c2fde45f803b4e170a072e5 not found: ID does not exist" Apr 16 16:07:55.151986 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:55.151953 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bd7b8f78c-488dr"] Apr 16 16:07:55.158690 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:55.158660 2576 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-bd7b8f78c-488dr"] Apr 16 16:07:55.278097 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:55.278053 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45052aca-38b1-43e2-8dbb-e08cd9a6ade2" path="/var/lib/kubelet/pods/45052aca-38b1-43e2-8dbb-e08cd9a6ade2/volumes" Apr 16 16:07:57.132358 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:07:57.132235 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 16:09:18.146948 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.146870 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc"] Apr 16 16:09:18.147393 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.147124 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45052aca-38b1-43e2-8dbb-e08cd9a6ade2" containerName="console" Apr 16 16:09:18.147393 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.147136 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="45052aca-38b1-43e2-8dbb-e08cd9a6ade2" containerName="console" Apr 16 16:09:18.147393 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.147180 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="45052aca-38b1-43e2-8dbb-e08cd9a6ade2" containerName="console" Apr 16 16:09:18.150065 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.150048 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" Apr 16 16:09:18.158701 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.158686 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-n6h5m\"" Apr 16 16:09:18.158968 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.158953 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 16:09:18.159192 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.159178 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 16:09:18.169761 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.169743 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc"] Apr 16 16:09:18.182885 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.182855 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc\" (UID: \"bb67cb47-1b49-481c-8bd2-f0cc28253a4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" Apr 16 16:09:18.182970 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.182906 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rklvj\" (UniqueName: \"kubernetes.io/projected/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-kube-api-access-rklvj\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc\" (UID: \"bb67cb47-1b49-481c-8bd2-f0cc28253a4e\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" Apr 16 16:09:18.182970 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.182958 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc\" (UID: \"bb67cb47-1b49-481c-8bd2-f0cc28253a4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" Apr 16 16:09:18.284460 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.284426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc\" (UID: \"bb67cb47-1b49-481c-8bd2-f0cc28253a4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" Apr 16 16:09:18.284609 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.284515 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc\" (UID: \"bb67cb47-1b49-481c-8bd2-f0cc28253a4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" Apr 16 16:09:18.284652 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.284618 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rklvj\" (UniqueName: \"kubernetes.io/projected/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-kube-api-access-rklvj\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc\" (UID: \"bb67cb47-1b49-481c-8bd2-f0cc28253a4e\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" Apr 16 16:09:18.284747 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.284731 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc\" (UID: \"bb67cb47-1b49-481c-8bd2-f0cc28253a4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" Apr 16 16:09:18.284833 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.284815 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc\" (UID: \"bb67cb47-1b49-481c-8bd2-f0cc28253a4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" Apr 16 16:09:18.294513 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.294491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rklvj\" (UniqueName: \"kubernetes.io/projected/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-kube-api-access-rklvj\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc\" (UID: \"bb67cb47-1b49-481c-8bd2-f0cc28253a4e\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" Apr 16 16:09:18.459053 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.458976 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" Apr 16 16:09:18.576740 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.576709 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc"] Apr 16 16:09:18.579675 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:09:18.579641 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb67cb47_1b49_481c_8bd2_f0cc28253a4e.slice/crio-8d561b0d9b28836096d752e51df8a035224aa4094c7c6a9a1e40eb2ae143b5c3 WatchSource:0}: Error finding container 8d561b0d9b28836096d752e51df8a035224aa4094c7c6a9a1e40eb2ae143b5c3: Status 404 returned error can't find the container with id 8d561b0d9b28836096d752e51df8a035224aa4094c7c6a9a1e40eb2ae143b5c3 Apr 16 16:09:18.581745 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:18.581730 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:09:19.346624 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:19.346586 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" event={"ID":"bb67cb47-1b49-481c-8bd2-f0cc28253a4e","Type":"ContainerStarted","Data":"8d561b0d9b28836096d752e51df8a035224aa4094c7c6a9a1e40eb2ae143b5c3"} Apr 16 16:09:24.361634 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:24.361600 2576 generic.go:358] "Generic (PLEG): container finished" podID="bb67cb47-1b49-481c-8bd2-f0cc28253a4e" containerID="2439d90758895af6247f908caa61e4a717f9ab1d800d665ef46857050e215b95" exitCode=0 Apr 16 16:09:24.362015 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:24.361684 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" 
event={"ID":"bb67cb47-1b49-481c-8bd2-f0cc28253a4e","Type":"ContainerDied","Data":"2439d90758895af6247f908caa61e4a717f9ab1d800d665ef46857050e215b95"} Apr 16 16:09:29.375790 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:29.375751 2576 generic.go:358] "Generic (PLEG): container finished" podID="bb67cb47-1b49-481c-8bd2-f0cc28253a4e" containerID="3535895b29666ab711d54dfd3371d19d6a925dc722c0f56d20beb9e18b226e9a" exitCode=0 Apr 16 16:09:29.376151 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:29.375823 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" event={"ID":"bb67cb47-1b49-481c-8bd2-f0cc28253a4e","Type":"ContainerDied","Data":"3535895b29666ab711d54dfd3371d19d6a925dc722c0f56d20beb9e18b226e9a"} Apr 16 16:09:35.393242 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:35.393195 2576 generic.go:358] "Generic (PLEG): container finished" podID="bb67cb47-1b49-481c-8bd2-f0cc28253a4e" containerID="9b13255b910927861d11c6604e1387de737a3c5ed086d943268d4b201b1bde32" exitCode=0 Apr 16 16:09:35.393653 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:35.393281 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" event={"ID":"bb67cb47-1b49-481c-8bd2-f0cc28253a4e","Type":"ContainerDied","Data":"9b13255b910927861d11c6604e1387de737a3c5ed086d943268d4b201b1bde32"} Apr 16 16:09:36.509793 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:36.509772 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" Apr 16 16:09:36.633101 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:36.633076 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rklvj\" (UniqueName: \"kubernetes.io/projected/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-kube-api-access-rklvj\") pod \"bb67cb47-1b49-481c-8bd2-f0cc28253a4e\" (UID: \"bb67cb47-1b49-481c-8bd2-f0cc28253a4e\") " Apr 16 16:09:36.633248 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:36.633129 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-util\") pod \"bb67cb47-1b49-481c-8bd2-f0cc28253a4e\" (UID: \"bb67cb47-1b49-481c-8bd2-f0cc28253a4e\") " Apr 16 16:09:36.633248 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:36.633178 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-bundle\") pod \"bb67cb47-1b49-481c-8bd2-f0cc28253a4e\" (UID: \"bb67cb47-1b49-481c-8bd2-f0cc28253a4e\") " Apr 16 16:09:36.633802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:36.633779 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-bundle" (OuterVolumeSpecName: "bundle") pod "bb67cb47-1b49-481c-8bd2-f0cc28253a4e" (UID: "bb67cb47-1b49-481c-8bd2-f0cc28253a4e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:09:36.635249 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:36.635227 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-kube-api-access-rklvj" (OuterVolumeSpecName: "kube-api-access-rklvj") pod "bb67cb47-1b49-481c-8bd2-f0cc28253a4e" (UID: "bb67cb47-1b49-481c-8bd2-f0cc28253a4e"). InnerVolumeSpecName "kube-api-access-rklvj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:09:36.637056 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:36.637038 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-util" (OuterVolumeSpecName: "util") pod "bb67cb47-1b49-481c-8bd2-f0cc28253a4e" (UID: "bb67cb47-1b49-481c-8bd2-f0cc28253a4e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:09:36.734266 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:36.734178 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rklvj\" (UniqueName: \"kubernetes.io/projected/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-kube-api-access-rklvj\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:09:36.734266 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:36.734225 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-util\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:09:36.734266 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:36.734235 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bb67cb47-1b49-481c-8bd2-f0cc28253a4e-bundle\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:09:37.399773 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:37.399739 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" event={"ID":"bb67cb47-1b49-481c-8bd2-f0cc28253a4e","Type":"ContainerDied","Data":"8d561b0d9b28836096d752e51df8a035224aa4094c7c6a9a1e40eb2ae143b5c3"} Apr 16 16:09:37.399773 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:37.399772 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d561b0d9b28836096d752e51df8a035224aa4094c7c6a9a1e40eb2ae143b5c3" Apr 16 16:09:37.399952 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:37.399810 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csfwnc" Apr 16 16:09:40.244720 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.244683 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp"] Apr 16 16:09:40.245097 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.244985 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb67cb47-1b49-481c-8bd2-f0cc28253a4e" containerName="extract" Apr 16 16:09:40.245097 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.244998 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb67cb47-1b49-481c-8bd2-f0cc28253a4e" containerName="extract" Apr 16 16:09:40.245097 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.245010 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb67cb47-1b49-481c-8bd2-f0cc28253a4e" containerName="pull" Apr 16 16:09:40.245097 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.245015 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb67cb47-1b49-481c-8bd2-f0cc28253a4e" containerName="pull" Apr 16 16:09:40.245097 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.245034 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="bb67cb47-1b49-481c-8bd2-f0cc28253a4e" containerName="util" Apr 16 16:09:40.245097 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.245040 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb67cb47-1b49-481c-8bd2-f0cc28253a4e" containerName="util" Apr 16 16:09:40.245097 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.245077 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb67cb47-1b49-481c-8bd2-f0cc28253a4e" containerName="extract" Apr 16 16:09:40.283860 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.283827 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp"] Apr 16 16:09:40.283994 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.283937 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp" Apr 16 16:09:40.286593 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.286574 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 16:09:40.286593 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.286590 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 16:09:40.286863 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.286850 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-8t4nr\"" Apr 16 16:09:40.287037 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.287020 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 16:09:40.359540 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.359512 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/secret/aee0d658-7bc4-408a-a95b-d272d2437700-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-825jp\" (UID: \"aee0d658-7bc4-408a-a95b-d272d2437700\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp" Apr 16 16:09:40.359665 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.359563 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kvj4\" (UniqueName: \"kubernetes.io/projected/aee0d658-7bc4-408a-a95b-d272d2437700-kube-api-access-9kvj4\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-825jp\" (UID: \"aee0d658-7bc4-408a-a95b-d272d2437700\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp" Apr 16 16:09:40.460746 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.460722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/aee0d658-7bc4-408a-a95b-d272d2437700-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-825jp\" (UID: \"aee0d658-7bc4-408a-a95b-d272d2437700\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp" Apr 16 16:09:40.460870 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.460769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kvj4\" (UniqueName: \"kubernetes.io/projected/aee0d658-7bc4-408a-a95b-d272d2437700-kube-api-access-9kvj4\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-825jp\" (UID: \"aee0d658-7bc4-408a-a95b-d272d2437700\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp" Apr 16 16:09:40.462905 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.462888 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/aee0d658-7bc4-408a-a95b-d272d2437700-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-825jp\" (UID: 
\"aee0d658-7bc4-408a-a95b-d272d2437700\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp" Apr 16 16:09:40.469939 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.469918 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kvj4\" (UniqueName: \"kubernetes.io/projected/aee0d658-7bc4-408a-a95b-d272d2437700-kube-api-access-9kvj4\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-825jp\" (UID: \"aee0d658-7bc4-408a-a95b-d272d2437700\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp" Apr 16 16:09:40.594356 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.594302 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp" Apr 16 16:09:40.718376 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:40.718337 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp"] Apr 16 16:09:40.722577 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:09:40.722544 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee0d658_7bc4_408a_a95b_d272d2437700.slice/crio-2407781e0200b623287f34ef2a89cf9b94e4a7c828e203362f95e7eac3d17686 WatchSource:0}: Error finding container 2407781e0200b623287f34ef2a89cf9b94e4a7c828e203362f95e7eac3d17686: Status 404 returned error can't find the container with id 2407781e0200b623287f34ef2a89cf9b94e4a7c828e203362f95e7eac3d17686 Apr 16 16:09:41.411759 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:41.411720 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp" event={"ID":"aee0d658-7bc4-408a-a95b-d272d2437700","Type":"ContainerStarted","Data":"2407781e0200b623287f34ef2a89cf9b94e4a7c828e203362f95e7eac3d17686"} Apr 16 16:09:45.164339 ip-10-0-137-150 kubenswrapper[2576]: 
I0416 16:09:45.164299 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jdfq4"] Apr 16 16:09:45.167725 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.167702 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-jdfq4" Apr 16 16:09:45.170009 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.169987 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 16:09:45.170112 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.169992 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 16 16:09:45.170167 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.170111 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-trhqz\"" Apr 16 16:09:45.181035 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.181016 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jdfq4"] Apr 16 16:09:45.298494 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.298465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wmjk\" (UniqueName: \"kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-kube-api-access-2wmjk\") pod \"keda-operator-ffbb595cb-jdfq4\" (UID: \"60690d37-b482-45a9-9574-cb65483ea121\") " pod="openshift-keda/keda-operator-ffbb595cb-jdfq4" Apr 16 16:09:45.298630 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.298526 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/60690d37-b482-45a9-9574-cb65483ea121-cabundle0\") pod \"keda-operator-ffbb595cb-jdfq4\" (UID: \"60690d37-b482-45a9-9574-cb65483ea121\") " 
pod="openshift-keda/keda-operator-ffbb595cb-jdfq4" Apr 16 16:09:45.298630 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.298576 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-certificates\") pod \"keda-operator-ffbb595cb-jdfq4\" (UID: \"60690d37-b482-45a9-9574-cb65483ea121\") " pod="openshift-keda/keda-operator-ffbb595cb-jdfq4" Apr 16 16:09:45.399802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.399774 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2wmjk\" (UniqueName: \"kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-kube-api-access-2wmjk\") pod \"keda-operator-ffbb595cb-jdfq4\" (UID: \"60690d37-b482-45a9-9574-cb65483ea121\") " pod="openshift-keda/keda-operator-ffbb595cb-jdfq4" Apr 16 16:09:45.399960 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.399819 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/60690d37-b482-45a9-9574-cb65483ea121-cabundle0\") pod \"keda-operator-ffbb595cb-jdfq4\" (UID: \"60690d37-b482-45a9-9574-cb65483ea121\") " pod="openshift-keda/keda-operator-ffbb595cb-jdfq4" Apr 16 16:09:45.399960 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.399844 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-certificates\") pod \"keda-operator-ffbb595cb-jdfq4\" (UID: \"60690d37-b482-45a9-9574-cb65483ea121\") " pod="openshift-keda/keda-operator-ffbb595cb-jdfq4" Apr 16 16:09:45.400055 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:45.399966 2576 secret.go:281] references non-existent secret key: ca.crt Apr 16 16:09:45.400055 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:45.399982 2576 projected.go:277] Couldn't get secret 
payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 16 16:09:45.400055 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:45.399993 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jdfq4: references non-existent secret key: ca.crt Apr 16 16:09:45.400155 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:45.400061 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-certificates podName:60690d37-b482-45a9-9574-cb65483ea121 nodeName:}" failed. No retries permitted until 2026-04-16 16:09:45.900040258 +0000 UTC m=+409.255030378 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-certificates") pod "keda-operator-ffbb595cb-jdfq4" (UID: "60690d37-b482-45a9-9574-cb65483ea121") : references non-existent secret key: ca.crt Apr 16 16:09:45.400540 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.400521 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/60690d37-b482-45a9-9574-cb65483ea121-cabundle0\") pod \"keda-operator-ffbb595cb-jdfq4\" (UID: \"60690d37-b482-45a9-9574-cb65483ea121\") " pod="openshift-keda/keda-operator-ffbb595cb-jdfq4" Apr 16 16:09:45.408735 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.408712 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wmjk\" (UniqueName: \"kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-kube-api-access-2wmjk\") pod \"keda-operator-ffbb595cb-jdfq4\" (UID: \"60690d37-b482-45a9-9574-cb65483ea121\") " pod="openshift-keda/keda-operator-ffbb595cb-jdfq4" Apr 16 16:09:45.424153 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.424081 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp" event={"ID":"aee0d658-7bc4-408a-a95b-d272d2437700","Type":"ContainerStarted","Data":"46204e8a855f9c1ebbd06aa69d41605ed37510d448e73d30194d130fff35eb19"} Apr 16 16:09:45.424264 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.424243 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp" Apr 16 16:09:45.447608 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.447567 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp" podStartSLOduration=1.603489224 podStartE2EDuration="5.447554437s" podCreationTimestamp="2026-04-16 16:09:40 +0000 UTC" firstStartedPulling="2026-04-16 16:09:40.724402832 +0000 UTC m=+404.079392940" lastFinishedPulling="2026-04-16 16:09:44.568468045 +0000 UTC m=+407.923458153" observedRunningTime="2026-04-16 16:09:45.446551874 +0000 UTC m=+408.801542003" watchObservedRunningTime="2026-04-16 16:09:45.447554437 +0000 UTC m=+408.802544563" Apr 16 16:09:45.583360 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.583332 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"] Apr 16 16:09:45.586573 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.586556 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q" Apr 16 16:09:45.588935 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.588916 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 16:09:45.601088 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.601066 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"] Apr 16 16:09:45.702247 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.702149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/17eabab5-ad35-4673-88c8-7385dcdd7f50-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8mn6q\" (UID: \"17eabab5-ad35-4673-88c8-7385dcdd7f50\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q" Apr 16 16:09:45.702247 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.702185 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fntml\" (UniqueName: \"kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-kube-api-access-fntml\") pod \"keda-metrics-apiserver-7c9f485588-8mn6q\" (UID: \"17eabab5-ad35-4673-88c8-7385dcdd7f50\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q" Apr 16 16:09:45.702247 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.702207 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8mn6q\" (UID: \"17eabab5-ad35-4673-88c8-7385dcdd7f50\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q" Apr 16 16:09:45.802791 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.802761 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fntml\" (UniqueName: \"kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-kube-api-access-fntml\") pod \"keda-metrics-apiserver-7c9f485588-8mn6q\" (UID: \"17eabab5-ad35-4673-88c8-7385dcdd7f50\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"
Apr 16 16:09:45.802791 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.802796 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8mn6q\" (UID: \"17eabab5-ad35-4673-88c8-7385dcdd7f50\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"
Apr 16 16:09:45.802981 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.802863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/17eabab5-ad35-4673-88c8-7385dcdd7f50-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8mn6q\" (UID: \"17eabab5-ad35-4673-88c8-7385dcdd7f50\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"
Apr 16 16:09:45.802981 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:45.802945 2576 secret.go:281] references non-existent secret key: tls.crt
Apr 16 16:09:45.802981 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:45.802966 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 16:09:45.803071 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:45.802990 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q: references non-existent secret key: tls.crt
Apr 16 16:09:45.803071 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:45.803047 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-certificates podName:17eabab5-ad35-4673-88c8-7385dcdd7f50 nodeName:}" failed. No retries permitted until 2026-04-16 16:09:46.303027292 +0000 UTC m=+409.658017409 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-certificates") pod "keda-metrics-apiserver-7c9f485588-8mn6q" (UID: "17eabab5-ad35-4673-88c8-7385dcdd7f50") : references non-existent secret key: tls.crt
Apr 16 16:09:45.803165 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.803147 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/17eabab5-ad35-4673-88c8-7385dcdd7f50-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-8mn6q\" (UID: \"17eabab5-ad35-4673-88c8-7385dcdd7f50\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"
Apr 16 16:09:45.813604 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.813578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fntml\" (UniqueName: \"kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-kube-api-access-fntml\") pod \"keda-metrics-apiserver-7c9f485588-8mn6q\" (UID: \"17eabab5-ad35-4673-88c8-7385dcdd7f50\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"
Apr 16 16:09:45.903931 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:45.903903 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-certificates\") pod \"keda-operator-ffbb595cb-jdfq4\" (UID: \"60690d37-b482-45a9-9574-cb65483ea121\") " pod="openshift-keda/keda-operator-ffbb595cb-jdfq4"
Apr 16 16:09:45.904063 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:45.904014 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 16 16:09:45.904063 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:45.904026 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 16:09:45.904063 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:45.904036 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jdfq4: references non-existent secret key: ca.crt
Apr 16 16:09:45.904172 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:45.904078 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-certificates podName:60690d37-b482-45a9-9574-cb65483ea121 nodeName:}" failed. No retries permitted until 2026-04-16 16:09:46.904064925 +0000 UTC m=+410.259055033 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-certificates") pod "keda-operator-ffbb595cb-jdfq4" (UID: "60690d37-b482-45a9-9574-cb65483ea121") : references non-existent secret key: ca.crt
Apr 16 16:09:46.308838 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:46.308800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8mn6q\" (UID: \"17eabab5-ad35-4673-88c8-7385dcdd7f50\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"
Apr 16 16:09:46.309179 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:46.308949 2576 secret.go:281] references non-existent secret key: tls.crt
Apr 16 16:09:46.309179 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:46.308968 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 16:09:46.309179 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:46.308986 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q: references non-existent secret key: tls.crt
Apr 16 16:09:46.309179 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:46.309043 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-certificates podName:17eabab5-ad35-4673-88c8-7385dcdd7f50 nodeName:}" failed. No retries permitted until 2026-04-16 16:09:47.30902777 +0000 UTC m=+410.664017877 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-certificates") pod "keda-metrics-apiserver-7c9f485588-8mn6q" (UID: "17eabab5-ad35-4673-88c8-7385dcdd7f50") : references non-existent secret key: tls.crt
Apr 16 16:09:46.913526 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:46.913491 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-certificates\") pod \"keda-operator-ffbb595cb-jdfq4\" (UID: \"60690d37-b482-45a9-9574-cb65483ea121\") " pod="openshift-keda/keda-operator-ffbb595cb-jdfq4"
Apr 16 16:09:46.913689 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:46.913609 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 16 16:09:46.913689 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:46.913623 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 16:09:46.913689 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:46.913631 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jdfq4: references non-existent secret key: ca.crt
Apr 16 16:09:46.913689 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:46.913676 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-certificates podName:60690d37-b482-45a9-9574-cb65483ea121 nodeName:}" failed. No retries permitted until 2026-04-16 16:09:48.913662621 +0000 UTC m=+412.268652727 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-certificates") pod "keda-operator-ffbb595cb-jdfq4" (UID: "60690d37-b482-45a9-9574-cb65483ea121") : references non-existent secret key: ca.crt
Apr 16 16:09:47.316111 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:47.316026 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8mn6q\" (UID: \"17eabab5-ad35-4673-88c8-7385dcdd7f50\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"
Apr 16 16:09:47.316507 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:47.316156 2576 secret.go:281] references non-existent secret key: tls.crt
Apr 16 16:09:47.316507 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:47.316173 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 16:09:47.316507 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:47.316191 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q: references non-existent secret key: tls.crt
Apr 16 16:09:47.316507 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:47.316253 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-certificates podName:17eabab5-ad35-4673-88c8-7385dcdd7f50 nodeName:}" failed. No retries permitted until 2026-04-16 16:09:49.31623901 +0000 UTC m=+412.671229117 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-certificates") pod "keda-metrics-apiserver-7c9f485588-8mn6q" (UID: "17eabab5-ad35-4673-88c8-7385dcdd7f50") : references non-existent secret key: tls.crt
Apr 16 16:09:48.929930 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:48.929890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-certificates\") pod \"keda-operator-ffbb595cb-jdfq4\" (UID: \"60690d37-b482-45a9-9574-cb65483ea121\") " pod="openshift-keda/keda-operator-ffbb595cb-jdfq4"
Apr 16 16:09:48.930445 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:48.930026 2576 secret.go:281] references non-existent secret key: ca.crt
Apr 16 16:09:48.930445 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:48.930043 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 16:09:48.930445 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:48.930055 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-jdfq4: references non-existent secret key: ca.crt
Apr 16 16:09:48.930445 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:48.930112 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-certificates podName:60690d37-b482-45a9-9574-cb65483ea121 nodeName:}" failed. No retries permitted until 2026-04-16 16:09:52.930095391 +0000 UTC m=+416.285085499 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-certificates") pod "keda-operator-ffbb595cb-jdfq4" (UID: "60690d37-b482-45a9-9574-cb65483ea121") : references non-existent secret key: ca.crt
Apr 16 16:09:49.333000 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:49.332918 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8mn6q\" (UID: \"17eabab5-ad35-4673-88c8-7385dcdd7f50\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"
Apr 16 16:09:49.333293 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:49.333028 2576 secret.go:281] references non-existent secret key: tls.crt
Apr 16 16:09:49.333293 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:49.333040 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 16:09:49.333293 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:49.333057 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q: references non-existent secret key: tls.crt
Apr 16 16:09:49.333293 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:09:49.333106 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-certificates podName:17eabab5-ad35-4673-88c8-7385dcdd7f50 nodeName:}" failed. No retries permitted until 2026-04-16 16:09:53.333090304 +0000 UTC m=+416.688080412 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-certificates") pod "keda-metrics-apiserver-7c9f485588-8mn6q" (UID: "17eabab5-ad35-4673-88c8-7385dcdd7f50") : references non-existent secret key: tls.crt
Apr 16 16:09:52.960456 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:52.960420 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-certificates\") pod \"keda-operator-ffbb595cb-jdfq4\" (UID: \"60690d37-b482-45a9-9574-cb65483ea121\") " pod="openshift-keda/keda-operator-ffbb595cb-jdfq4"
Apr 16 16:09:52.962755 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:52.962734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/60690d37-b482-45a9-9574-cb65483ea121-certificates\") pod \"keda-operator-ffbb595cb-jdfq4\" (UID: \"60690d37-b482-45a9-9574-cb65483ea121\") " pod="openshift-keda/keda-operator-ffbb595cb-jdfq4"
Apr 16 16:09:52.977587 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:52.977566 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-jdfq4"
Apr 16 16:09:53.093840 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:53.091554 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-jdfq4"]
Apr 16 16:09:53.365566 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:53.365482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8mn6q\" (UID: \"17eabab5-ad35-4673-88c8-7385dcdd7f50\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"
Apr 16 16:09:53.367937 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:53.367914 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/17eabab5-ad35-4673-88c8-7385dcdd7f50-certificates\") pod \"keda-metrics-apiserver-7c9f485588-8mn6q\" (UID: \"17eabab5-ad35-4673-88c8-7385dcdd7f50\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"
Apr 16 16:09:53.396910 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:53.396881 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"
Apr 16 16:09:53.448559 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:53.448527 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-jdfq4" event={"ID":"60690d37-b482-45a9-9574-cb65483ea121","Type":"ContainerStarted","Data":"e00795cc05796c3c08c030e08d7fa19a2d3522bef4c36e03107e9b82e965c893"}
Apr 16 16:09:53.523832 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:53.523799 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"]
Apr 16 16:09:53.525993 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:09:53.525965 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17eabab5_ad35_4673_88c8_7385dcdd7f50.slice/crio-78fc893e6adb2ea052d4093213f01b9150bcf1b26939d95ac0afd7972b293b8b WatchSource:0}: Error finding container 78fc893e6adb2ea052d4093213f01b9150bcf1b26939d95ac0afd7972b293b8b: Status 404 returned error can't find the container with id 78fc893e6adb2ea052d4093213f01b9150bcf1b26939d95ac0afd7972b293b8b
Apr 16 16:09:54.452727 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:54.452674 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q" event={"ID":"17eabab5-ad35-4673-88c8-7385dcdd7f50","Type":"ContainerStarted","Data":"78fc893e6adb2ea052d4093213f01b9150bcf1b26939d95ac0afd7972b293b8b"}
Apr 16 16:09:57.463072 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:57.463039 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-jdfq4" event={"ID":"60690d37-b482-45a9-9574-cb65483ea121","Type":"ContainerStarted","Data":"53e0879165e574d18f125e22c0f38429241171c75e5ea1f76ebab9c2774097a9"}
Apr 16 16:09:57.463483 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:57.463129 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-jdfq4"
Apr 16 16:09:57.464335 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:57.464311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q" event={"ID":"17eabab5-ad35-4673-88c8-7385dcdd7f50","Type":"ContainerStarted","Data":"7a473f5180d28391c1e839838cc2defb3898ef9a38b64c5475882fd97396fbb4"}
Apr 16 16:09:57.464464 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:57.464443 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"
Apr 16 16:09:57.488012 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:57.487967 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-jdfq4" podStartSLOduration=8.484514144 podStartE2EDuration="12.48795468s" podCreationTimestamp="2026-04-16 16:09:45 +0000 UTC" firstStartedPulling="2026-04-16 16:09:53.096086012 +0000 UTC m=+416.451076119" lastFinishedPulling="2026-04-16 16:09:57.099526548 +0000 UTC m=+420.454516655" observedRunningTime="2026-04-16 16:09:57.487019317 +0000 UTC m=+420.842009526" watchObservedRunningTime="2026-04-16 16:09:57.48795468 +0000 UTC m=+420.842944809"
Apr 16 16:09:57.511400 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:09:57.511360 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q" podStartSLOduration=8.938210579 podStartE2EDuration="12.511348292s" podCreationTimestamp="2026-04-16 16:09:45 +0000 UTC" firstStartedPulling="2026-04-16 16:09:53.527342512 +0000 UTC m=+416.882332620" lastFinishedPulling="2026-04-16 16:09:57.100480214 +0000 UTC m=+420.455470333" observedRunningTime="2026-04-16 16:09:57.511127937 +0000 UTC m=+420.866118067" watchObservedRunningTime="2026-04-16 16:09:57.511348292 +0000 UTC m=+420.866338412"
Apr 16 16:10:06.429312 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:06.429281 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-825jp"
Apr 16 16:10:08.472407 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:08.472376 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-8mn6q"
Apr 16 16:10:18.469726 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:18.469632 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-jdfq4"
Apr 16 16:10:51.185932 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.185898 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"]
Apr 16 16:10:51.188975 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.188958 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"
Apr 16 16:10:51.191347 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.191324 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 16:10:51.191623 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.191604 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-xj8n2\""
Apr 16 16:10:51.192272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.192258 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\""
Apr 16 16:10:51.192324 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.192266 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 16:10:51.199361 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.199333 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"]
Apr 16 16:10:51.269962 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.269927 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e5ba264-e9f4-4f72-b79f-c53d6c16b138-cert\") pod \"kserve-controller-manager-7c68cb4fc8-jpfzr\" (UID: \"8e5ba264-e9f4-4f72-b79f-c53d6c16b138\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"
Apr 16 16:10:51.270115 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.270015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sx8k\" (UniqueName: \"kubernetes.io/projected/8e5ba264-e9f4-4f72-b79f-c53d6c16b138-kube-api-access-5sx8k\") pod \"kserve-controller-manager-7c68cb4fc8-jpfzr\" (UID: \"8e5ba264-e9f4-4f72-b79f-c53d6c16b138\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"
Apr 16 16:10:51.370828 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.370793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sx8k\" (UniqueName: \"kubernetes.io/projected/8e5ba264-e9f4-4f72-b79f-c53d6c16b138-kube-api-access-5sx8k\") pod \"kserve-controller-manager-7c68cb4fc8-jpfzr\" (UID: \"8e5ba264-e9f4-4f72-b79f-c53d6c16b138\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"
Apr 16 16:10:51.370982 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.370837 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e5ba264-e9f4-4f72-b79f-c53d6c16b138-cert\") pod \"kserve-controller-manager-7c68cb4fc8-jpfzr\" (UID: \"8e5ba264-e9f4-4f72-b79f-c53d6c16b138\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"
Apr 16 16:10:51.373176 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.373157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e5ba264-e9f4-4f72-b79f-c53d6c16b138-cert\") pod \"kserve-controller-manager-7c68cb4fc8-jpfzr\" (UID: \"8e5ba264-e9f4-4f72-b79f-c53d6c16b138\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"
Apr 16 16:10:51.378977 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.378957 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sx8k\" (UniqueName: \"kubernetes.io/projected/8e5ba264-e9f4-4f72-b79f-c53d6c16b138-kube-api-access-5sx8k\") pod \"kserve-controller-manager-7c68cb4fc8-jpfzr\" (UID: \"8e5ba264-e9f4-4f72-b79f-c53d6c16b138\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"
Apr 16 16:10:51.499364 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.499271 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"
Apr 16 16:10:51.616245 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.616193 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"]
Apr 16 16:10:51.619468 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:10:51.619434 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e5ba264_e9f4_4f72_b79f_c53d6c16b138.slice/crio-43567f19d4740c77f81036920540fa206bb2d13e26bdc36a51e11231ae304b9b WatchSource:0}: Error finding container 43567f19d4740c77f81036920540fa206bb2d13e26bdc36a51e11231ae304b9b: Status 404 returned error can't find the container with id 43567f19d4740c77f81036920540fa206bb2d13e26bdc36a51e11231ae304b9b
Apr 16 16:10:51.632133 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:51.632110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr" event={"ID":"8e5ba264-e9f4-4f72-b79f-c53d6c16b138","Type":"ContainerStarted","Data":"43567f19d4740c77f81036920540fa206bb2d13e26bdc36a51e11231ae304b9b"}
Apr 16 16:10:55.648464 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:55.648423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr" event={"ID":"8e5ba264-e9f4-4f72-b79f-c53d6c16b138","Type":"ContainerStarted","Data":"9f168ff56d85d0d67d00fa1a803b28760d48e4c2d3161c568a975e7dcfd80978"}
Apr 16 16:10:55.648850 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:55.648561 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"
Apr 16 16:10:55.664102 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:10:55.664058 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr" podStartSLOduration=1.320844031 podStartE2EDuration="4.664042666s" podCreationTimestamp="2026-04-16 16:10:51 +0000 UTC" firstStartedPulling="2026-04-16 16:10:51.620695076 +0000 UTC m=+474.975685189" lastFinishedPulling="2026-04-16 16:10:54.963893718 +0000 UTC m=+478.318883824" observedRunningTime="2026-04-16 16:10:55.66371105 +0000 UTC m=+479.018701181" watchObservedRunningTime="2026-04-16 16:10:55.664042666 +0000 UTC m=+479.019032794"
Apr 16 16:11:26.656821 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:26.656791 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"
Apr 16 16:11:27.716384 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:27.716343 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"]
Apr 16 16:11:27.716847 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:27.716666 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr" podUID="8e5ba264-e9f4-4f72-b79f-c53d6c16b138" containerName="manager" containerID="cri-o://9f168ff56d85d0d67d00fa1a803b28760d48e4c2d3161c568a975e7dcfd80978" gracePeriod=10
Apr 16 16:11:27.739852 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:27.739829 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-gwb89"]
Apr 16 16:11:27.743423 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:27.743406 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-gwb89"
Apr 16 16:11:27.752852 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:27.752829 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-gwb89"]
Apr 16 16:11:27.850752 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:27.850720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwxzv\" (UniqueName: \"kubernetes.io/projected/68ccd199-6bbd-41b6-b01a-ca60d98f86e9-kube-api-access-zwxzv\") pod \"kserve-controller-manager-7c68cb4fc8-gwb89\" (UID: \"68ccd199-6bbd-41b6-b01a-ca60d98f86e9\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-gwb89"
Apr 16 16:11:27.850852 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:27.850785 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68ccd199-6bbd-41b6-b01a-ca60d98f86e9-cert\") pod \"kserve-controller-manager-7c68cb4fc8-gwb89\" (UID: \"68ccd199-6bbd-41b6-b01a-ca60d98f86e9\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-gwb89"
Apr 16 16:11:27.951155 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:27.951128 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68ccd199-6bbd-41b6-b01a-ca60d98f86e9-cert\") pod \"kserve-controller-manager-7c68cb4fc8-gwb89\" (UID: \"68ccd199-6bbd-41b6-b01a-ca60d98f86e9\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-gwb89"
Apr 16 16:11:27.951278 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:27.951180 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwxzv\" (UniqueName: \"kubernetes.io/projected/68ccd199-6bbd-41b6-b01a-ca60d98f86e9-kube-api-access-zwxzv\") pod \"kserve-controller-manager-7c68cb4fc8-gwb89\" (UID: \"68ccd199-6bbd-41b6-b01a-ca60d98f86e9\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-gwb89"
Apr 16 16:11:27.951805 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:27.951787 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"
Apr 16 16:11:27.953492 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:27.953475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/68ccd199-6bbd-41b6-b01a-ca60d98f86e9-cert\") pod \"kserve-controller-manager-7c68cb4fc8-gwb89\" (UID: \"68ccd199-6bbd-41b6-b01a-ca60d98f86e9\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-gwb89"
Apr 16 16:11:27.959079 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:27.959056 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwxzv\" (UniqueName: \"kubernetes.io/projected/68ccd199-6bbd-41b6-b01a-ca60d98f86e9-kube-api-access-zwxzv\") pod \"kserve-controller-manager-7c68cb4fc8-gwb89\" (UID: \"68ccd199-6bbd-41b6-b01a-ca60d98f86e9\") " pod="kserve/kserve-controller-manager-7c68cb4fc8-gwb89"
Apr 16 16:11:28.051638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.051564 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sx8k\" (UniqueName: \"kubernetes.io/projected/8e5ba264-e9f4-4f72-b79f-c53d6c16b138-kube-api-access-5sx8k\") pod \"8e5ba264-e9f4-4f72-b79f-c53d6c16b138\" (UID: \"8e5ba264-e9f4-4f72-b79f-c53d6c16b138\") "
Apr 16 16:11:28.051772 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.051636 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e5ba264-e9f4-4f72-b79f-c53d6c16b138-cert\") pod \"8e5ba264-e9f4-4f72-b79f-c53d6c16b138\" (UID: \"8e5ba264-e9f4-4f72-b79f-c53d6c16b138\") "
Apr 16 16:11:28.053615 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.053582 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e5ba264-e9f4-4f72-b79f-c53d6c16b138-kube-api-access-5sx8k" (OuterVolumeSpecName: "kube-api-access-5sx8k") pod "8e5ba264-e9f4-4f72-b79f-c53d6c16b138" (UID: "8e5ba264-e9f4-4f72-b79f-c53d6c16b138"). InnerVolumeSpecName "kube-api-access-5sx8k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 16:11:28.053615 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.053592 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e5ba264-e9f4-4f72-b79f-c53d6c16b138-cert" (OuterVolumeSpecName: "cert") pod "8e5ba264-e9f4-4f72-b79f-c53d6c16b138" (UID: "8e5ba264-e9f4-4f72-b79f-c53d6c16b138"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:11:28.093906 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.093881 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-gwb89"
Apr 16 16:11:28.153329 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.153297 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5sx8k\" (UniqueName: \"kubernetes.io/projected/8e5ba264-e9f4-4f72-b79f-c53d6c16b138-kube-api-access-5sx8k\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:11:28.153451 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.153333 2576 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8e5ba264-e9f4-4f72-b79f-c53d6c16b138-cert\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:11:28.210314 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.210286 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-gwb89"]
Apr 16 16:11:28.212636 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:11:28.212611 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68ccd199_6bbd_41b6_b01a_ca60d98f86e9.slice/crio-3a6849a2bfa0e843457c0063b50b3b2883055175445597008d5a2569a54f0954 WatchSource:0}: Error finding container 3a6849a2bfa0e843457c0063b50b3b2883055175445597008d5a2569a54f0954: Status 404 returned error can't find the container with id 3a6849a2bfa0e843457c0063b50b3b2883055175445597008d5a2569a54f0954
Apr 16 16:11:28.748013 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.747986 2576 generic.go:358] "Generic (PLEG): container finished" podID="8e5ba264-e9f4-4f72-b79f-c53d6c16b138" containerID="9f168ff56d85d0d67d00fa1a803b28760d48e4c2d3161c568a975e7dcfd80978" exitCode=0
Apr 16 16:11:28.748359 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.748048 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"
Apr 16 16:11:28.748359 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.748062 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr" event={"ID":"8e5ba264-e9f4-4f72-b79f-c53d6c16b138","Type":"ContainerDied","Data":"9f168ff56d85d0d67d00fa1a803b28760d48e4c2d3161c568a975e7dcfd80978"}
Apr 16 16:11:28.748359 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.748092 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-jpfzr" event={"ID":"8e5ba264-e9f4-4f72-b79f-c53d6c16b138","Type":"ContainerDied","Data":"43567f19d4740c77f81036920540fa206bb2d13e26bdc36a51e11231ae304b9b"}
Apr 16 16:11:28.748359 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.748106 2576 scope.go:117] "RemoveContainer" containerID="9f168ff56d85d0d67d00fa1a803b28760d48e4c2d3161c568a975e7dcfd80978"
Apr 16 16:11:28.749739 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.749718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-gwb89" event={"ID":"68ccd199-6bbd-41b6-b01a-ca60d98f86e9","Type":"ContainerStarted","Data":"b14c662f4895b8c22d43f168339519296e867941bd97f53ced63e8338622fb38"}
Apr 16 16:11:28.749845 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.749748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-7c68cb4fc8-gwb89" event={"ID":"68ccd199-6bbd-41b6-b01a-ca60d98f86e9","Type":"ContainerStarted","Data":"3a6849a2bfa0e843457c0063b50b3b2883055175445597008d5a2569a54f0954"}
Apr 16 16:11:28.749902 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.749849 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-gwb89"
Apr 16 16:11:28.756272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.756256 2576 scope.go:117] "RemoveContainer" containerID="9f168ff56d85d0d67d00fa1a803b28760d48e4c2d3161c568a975e7dcfd80978"
Apr 16 16:11:28.756583 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:11:28.756563 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f168ff56d85d0d67d00fa1a803b28760d48e4c2d3161c568a975e7dcfd80978\": container with ID starting with 9f168ff56d85d0d67d00fa1a803b28760d48e4c2d3161c568a975e7dcfd80978 not found: ID does not exist" containerID="9f168ff56d85d0d67d00fa1a803b28760d48e4c2d3161c568a975e7dcfd80978"
Apr 16 16:11:28.756662 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.756587 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f168ff56d85d0d67d00fa1a803b28760d48e4c2d3161c568a975e7dcfd80978"} err="failed to get container status \"9f168ff56d85d0d67d00fa1a803b28760d48e4c2d3161c568a975e7dcfd80978\": rpc error: code = NotFound desc = could not find container \"9f168ff56d85d0d67d00fa1a803b28760d48e4c2d3161c568a975e7dcfd80978\": container with ID starting with 9f168ff56d85d0d67d00fa1a803b28760d48e4c2d3161c568a975e7dcfd80978 not found: ID does not exist"
Apr 16 16:11:28.767523 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.767489 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-7c68cb4fc8-gwb89" podStartSLOduration=1.4276118389999999 podStartE2EDuration="1.767478212s" podCreationTimestamp="2026-04-16 16:11:27 +0000 UTC" firstStartedPulling="2026-04-16 16:11:28.213807841 +0000 UTC m=+511.568797953" lastFinishedPulling="2026-04-16 16:11:28.553674219 +0000 UTC m=+511.908664326" observedRunningTime="2026-04-16 16:11:28.766113847 +0000 UTC m=+512.121103977" watchObservedRunningTime="2026-04-16 16:11:28.767478212 +0000 UTC m=+512.122468372"
Apr 16 16:11:28.778611 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.778594 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"]
Apr 16 16:11:28.780695 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:28.780675 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-7c68cb4fc8-jpfzr"]
Apr 16 16:11:29.279082 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:29.279052 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e5ba264-e9f4-4f72-b79f-c53d6c16b138" path="/var/lib/kubelet/pods/8e5ba264-e9f4-4f72-b79f-c53d6c16b138/volumes"
Apr 16 16:11:33.069057 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.069028 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69589b6f45-b29qs"]
Apr 16 16:11:33.069451 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.069316 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e5ba264-e9f4-4f72-b79f-c53d6c16b138" containerName="manager"
Apr 16 16:11:33.069451 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.069328 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e5ba264-e9f4-4f72-b79f-c53d6c16b138" containerName="manager"
Apr 16 16:11:33.069451 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.069377 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e5ba264-e9f4-4f72-b79f-c53d6c16b138" containerName="manager"
Apr 16 16:11:33.073435 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.073415 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.080487 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.080464 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69589b6f45-b29qs"] Apr 16 16:11:33.192483 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.192455 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de4905de-aba1-464b-b9be-0bb515ddf375-oauth-serving-cert\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.192483 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.192483 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5zsd\" (UniqueName: \"kubernetes.io/projected/de4905de-aba1-464b-b9be-0bb515ddf375-kube-api-access-b5zsd\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.192629 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.192503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de4905de-aba1-464b-b9be-0bb515ddf375-console-serving-cert\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.192629 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.192536 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de4905de-aba1-464b-b9be-0bb515ddf375-console-oauth-config\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " 
pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.192629 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.192564 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de4905de-aba1-464b-b9be-0bb515ddf375-console-config\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.192629 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.192582 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de4905de-aba1-464b-b9be-0bb515ddf375-trusted-ca-bundle\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.192629 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.192596 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de4905de-aba1-464b-b9be-0bb515ddf375-service-ca\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.293310 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.293280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de4905de-aba1-464b-b9be-0bb515ddf375-console-config\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.293425 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.293316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/de4905de-aba1-464b-b9be-0bb515ddf375-trusted-ca-bundle\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.293425 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.293333 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de4905de-aba1-464b-b9be-0bb515ddf375-service-ca\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.293425 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.293388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de4905de-aba1-464b-b9be-0bb515ddf375-oauth-serving-cert\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.293425 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.293405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5zsd\" (UniqueName: \"kubernetes.io/projected/de4905de-aba1-464b-b9be-0bb515ddf375-kube-api-access-b5zsd\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.293425 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.293422 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de4905de-aba1-464b-b9be-0bb515ddf375-console-serving-cert\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.293625 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.293443 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de4905de-aba1-464b-b9be-0bb515ddf375-console-oauth-config\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.294098 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.294067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de4905de-aba1-464b-b9be-0bb515ddf375-console-config\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.294242 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.294181 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de4905de-aba1-464b-b9be-0bb515ddf375-service-ca\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.294323 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.294287 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de4905de-aba1-464b-b9be-0bb515ddf375-oauth-serving-cert\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.294384 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.294329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de4905de-aba1-464b-b9be-0bb515ddf375-trusted-ca-bundle\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.295912 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.295883 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de4905de-aba1-464b-b9be-0bb515ddf375-console-serving-cert\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.296034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.296018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de4905de-aba1-464b-b9be-0bb515ddf375-console-oauth-config\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.301101 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.301074 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5zsd\" (UniqueName: \"kubernetes.io/projected/de4905de-aba1-464b-b9be-0bb515ddf375-kube-api-access-b5zsd\") pod \"console-69589b6f45-b29qs\" (UID: \"de4905de-aba1-464b-b9be-0bb515ddf375\") " pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.382875 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.382815 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:33.502570 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.502548 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69589b6f45-b29qs"] Apr 16 16:11:33.504928 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:11:33.504898 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde4905de_aba1_464b_b9be_0bb515ddf375.slice/crio-c9c69a715859a828ca2c8660635623e44ac00d4b8424d2df24d7732223c62b5c WatchSource:0}: Error finding container c9c69a715859a828ca2c8660635623e44ac00d4b8424d2df24d7732223c62b5c: Status 404 returned error can't find the container with id c9c69a715859a828ca2c8660635623e44ac00d4b8424d2df24d7732223c62b5c Apr 16 16:11:33.765642 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.765607 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69589b6f45-b29qs" event={"ID":"de4905de-aba1-464b-b9be-0bb515ddf375","Type":"ContainerStarted","Data":"a08d64dda6433a1ea8da2c0e6ee1ad603c902c0ae2d80931357dedb2cdaf2f43"} Apr 16 16:11:33.765642 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.765644 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69589b6f45-b29qs" event={"ID":"de4905de-aba1-464b-b9be-0bb515ddf375","Type":"ContainerStarted","Data":"c9c69a715859a828ca2c8660635623e44ac00d4b8424d2df24d7732223c62b5c"} Apr 16 16:11:33.783954 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:33.783909 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69589b6f45-b29qs" podStartSLOduration=0.78389808 podStartE2EDuration="783.89808ms" podCreationTimestamp="2026-04-16 16:11:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:11:33.783012898 +0000 UTC m=+517.138003028" 
watchObservedRunningTime="2026-04-16 16:11:33.78389808 +0000 UTC m=+517.138888200" Apr 16 16:11:43.383814 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:43.383768 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:43.383814 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:43.383818 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:43.388388 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:43.388367 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:43.799474 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:43.799448 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69589b6f45-b29qs" Apr 16 16:11:43.843161 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:43.843133 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fd85f4cb6-r9bxs"] Apr 16 16:11:59.757853 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:11:59.757823 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-7c68cb4fc8-gwb89" Apr 16 16:12:00.642181 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:00.642152 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-mw9hq"] Apr 16 16:12:00.646620 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:00.646599 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-mw9hq" Apr 16 16:12:00.648774 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:00.648753 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 16:12:00.648884 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:00.648794 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-d6vkv\"" Apr 16 16:12:00.654865 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:00.654842 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-mw9hq"] Apr 16 16:12:00.698040 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:00.698010 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a3adf15a-d6a5-42dc-b9f0-f81e45ce0206-tls-certs\") pod \"model-serving-api-86f7b4b499-mw9hq\" (UID: \"a3adf15a-d6a5-42dc-b9f0-f81e45ce0206\") " pod="kserve/model-serving-api-86f7b4b499-mw9hq" Apr 16 16:12:00.698155 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:00.698051 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84qrv\" (UniqueName: \"kubernetes.io/projected/a3adf15a-d6a5-42dc-b9f0-f81e45ce0206-kube-api-access-84qrv\") pod \"model-serving-api-86f7b4b499-mw9hq\" (UID: \"a3adf15a-d6a5-42dc-b9f0-f81e45ce0206\") " pod="kserve/model-serving-api-86f7b4b499-mw9hq" Apr 16 16:12:00.798546 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:00.798519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a3adf15a-d6a5-42dc-b9f0-f81e45ce0206-tls-certs\") pod \"model-serving-api-86f7b4b499-mw9hq\" (UID: \"a3adf15a-d6a5-42dc-b9f0-f81e45ce0206\") " pod="kserve/model-serving-api-86f7b4b499-mw9hq" Apr 16 16:12:00.798880 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:12:00.798553 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84qrv\" (UniqueName: \"kubernetes.io/projected/a3adf15a-d6a5-42dc-b9f0-f81e45ce0206-kube-api-access-84qrv\") pod \"model-serving-api-86f7b4b499-mw9hq\" (UID: \"a3adf15a-d6a5-42dc-b9f0-f81e45ce0206\") " pod="kserve/model-serving-api-86f7b4b499-mw9hq" Apr 16 16:12:00.798880 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:12:00.798666 2576 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 16 16:12:00.798880 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:12:00.798740 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3adf15a-d6a5-42dc-b9f0-f81e45ce0206-tls-certs podName:a3adf15a-d6a5-42dc-b9f0-f81e45ce0206 nodeName:}" failed. No retries permitted until 2026-04-16 16:12:01.298723185 +0000 UTC m=+544.653713291 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/a3adf15a-d6a5-42dc-b9f0-f81e45ce0206-tls-certs") pod "model-serving-api-86f7b4b499-mw9hq" (UID: "a3adf15a-d6a5-42dc-b9f0-f81e45ce0206") : secret "model-serving-api-tls" not found Apr 16 16:12:00.806721 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:00.806702 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84qrv\" (UniqueName: \"kubernetes.io/projected/a3adf15a-d6a5-42dc-b9f0-f81e45ce0206-kube-api-access-84qrv\") pod \"model-serving-api-86f7b4b499-mw9hq\" (UID: \"a3adf15a-d6a5-42dc-b9f0-f81e45ce0206\") " pod="kserve/model-serving-api-86f7b4b499-mw9hq" Apr 16 16:12:01.303334 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:01.303305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a3adf15a-d6a5-42dc-b9f0-f81e45ce0206-tls-certs\") pod \"model-serving-api-86f7b4b499-mw9hq\" (UID: 
\"a3adf15a-d6a5-42dc-b9f0-f81e45ce0206\") " pod="kserve/model-serving-api-86f7b4b499-mw9hq" Apr 16 16:12:01.305599 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:01.305569 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a3adf15a-d6a5-42dc-b9f0-f81e45ce0206-tls-certs\") pod \"model-serving-api-86f7b4b499-mw9hq\" (UID: \"a3adf15a-d6a5-42dc-b9f0-f81e45ce0206\") " pod="kserve/model-serving-api-86f7b4b499-mw9hq" Apr 16 16:12:01.557474 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:01.557391 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-mw9hq" Apr 16 16:12:01.677450 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:01.677426 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-mw9hq"] Apr 16 16:12:01.679686 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:12:01.679658 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3adf15a_d6a5_42dc_b9f0_f81e45ce0206.slice/crio-13ae615bd6a965350c9bbf806525bd6387790f91a17f2f3447b8584abaf504ee WatchSource:0}: Error finding container 13ae615bd6a965350c9bbf806525bd6387790f91a17f2f3447b8584abaf504ee: Status 404 returned error can't find the container with id 13ae615bd6a965350c9bbf806525bd6387790f91a17f2f3447b8584abaf504ee Apr 16 16:12:01.848966 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:01.848890 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-mw9hq" event={"ID":"a3adf15a-d6a5-42dc-b9f0-f81e45ce0206","Type":"ContainerStarted","Data":"13ae615bd6a965350c9bbf806525bd6387790f91a17f2f3447b8584abaf504ee"} Apr 16 16:12:04.861835 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:04.861800 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-mw9hq" 
event={"ID":"a3adf15a-d6a5-42dc-b9f0-f81e45ce0206","Type":"ContainerStarted","Data":"ba84700ddc7620d94fed8f2bdbf1f41688a52e2ba04fa9fa52f70c6ebcb083f1"} Apr 16 16:12:04.862200 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:04.861944 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-mw9hq" Apr 16 16:12:04.877454 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:04.877412 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-mw9hq" podStartSLOduration=2.570911127 podStartE2EDuration="4.877398218s" podCreationTimestamp="2026-04-16 16:12:00 +0000 UTC" firstStartedPulling="2026-04-16 16:12:01.681512823 +0000 UTC m=+545.036502936" lastFinishedPulling="2026-04-16 16:12:03.98799992 +0000 UTC m=+547.342990027" observedRunningTime="2026-04-16 16:12:04.876754443 +0000 UTC m=+548.231744571" watchObservedRunningTime="2026-04-16 16:12:04.877398218 +0000 UTC m=+548.232388346" Apr 16 16:12:08.862231 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:08.862174 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5fd85f4cb6-r9bxs" podUID="74c7ba9b-eb07-4da1-879f-8f4413d87608" containerName="console" containerID="cri-o://6e3dce62dd0ebc455fe1894f9d3948ff06cffdf20690cac3bc123a006bbdeb8d" gracePeriod=15 Apr 16 16:12:09.099644 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.099622 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fd85f4cb6-r9bxs_74c7ba9b-eb07-4da1-879f-8f4413d87608/console/0.log" Apr 16 16:12:09.099760 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.099693 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:12:09.162373 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.162353 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-serving-cert\") pod \"74c7ba9b-eb07-4da1-879f-8f4413d87608\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " Apr 16 16:12:09.162490 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.162386 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-oauth-serving-cert\") pod \"74c7ba9b-eb07-4da1-879f-8f4413d87608\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " Apr 16 16:12:09.162490 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.162403 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-trusted-ca-bundle\") pod \"74c7ba9b-eb07-4da1-879f-8f4413d87608\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " Apr 16 16:12:09.162490 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.162439 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-oauth-config\") pod \"74c7ba9b-eb07-4da1-879f-8f4413d87608\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " Apr 16 16:12:09.162490 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.162469 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-service-ca\") pod \"74c7ba9b-eb07-4da1-879f-8f4413d87608\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " Apr 16 16:12:09.162662 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.162500 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-config\") pod \"74c7ba9b-eb07-4da1-879f-8f4413d87608\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " Apr 16 16:12:09.162662 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.162578 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spbn8\" (UniqueName: \"kubernetes.io/projected/74c7ba9b-eb07-4da1-879f-8f4413d87608-kube-api-access-spbn8\") pod \"74c7ba9b-eb07-4da1-879f-8f4413d87608\" (UID: \"74c7ba9b-eb07-4da1-879f-8f4413d87608\") " Apr 16 16:12:09.162837 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.162797 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "74c7ba9b-eb07-4da1-879f-8f4413d87608" (UID: "74c7ba9b-eb07-4da1-879f-8f4413d87608"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:12:09.162944 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.162903 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-service-ca" (OuterVolumeSpecName: "service-ca") pod "74c7ba9b-eb07-4da1-879f-8f4413d87608" (UID: "74c7ba9b-eb07-4da1-879f-8f4413d87608"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:12:09.162944 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.162911 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "74c7ba9b-eb07-4da1-879f-8f4413d87608" (UID: "74c7ba9b-eb07-4da1-879f-8f4413d87608"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:12:09.163114 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.162997 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-config" (OuterVolumeSpecName: "console-config") pod "74c7ba9b-eb07-4da1-879f-8f4413d87608" (UID: "74c7ba9b-eb07-4da1-879f-8f4413d87608"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:12:09.164612 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.164589 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c7ba9b-eb07-4da1-879f-8f4413d87608-kube-api-access-spbn8" (OuterVolumeSpecName: "kube-api-access-spbn8") pod "74c7ba9b-eb07-4da1-879f-8f4413d87608" (UID: "74c7ba9b-eb07-4da1-879f-8f4413d87608"). InnerVolumeSpecName "kube-api-access-spbn8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:12:09.164736 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.164714 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "74c7ba9b-eb07-4da1-879f-8f4413d87608" (UID: "74c7ba9b-eb07-4da1-879f-8f4413d87608"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:12:09.164818 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.164775 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "74c7ba9b-eb07-4da1-879f-8f4413d87608" (UID: "74c7ba9b-eb07-4da1-879f-8f4413d87608"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:12:09.263798 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.263774 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-serving-cert\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:12:09.263798 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.263795 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-oauth-serving-cert\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:12:09.263918 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.263805 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-trusted-ca-bundle\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:12:09.263918 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.263814 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-oauth-config\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:12:09.263918 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.263822 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-service-ca\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:12:09.263918 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.263831 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74c7ba9b-eb07-4da1-879f-8f4413d87608-console-config\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:12:09.263918 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.263839 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-spbn8\" (UniqueName: \"kubernetes.io/projected/74c7ba9b-eb07-4da1-879f-8f4413d87608-kube-api-access-spbn8\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:12:09.877499 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.877469 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5fd85f4cb6-r9bxs_74c7ba9b-eb07-4da1-879f-8f4413d87608/console/0.log" Apr 16 16:12:09.877936 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.877514 2576 generic.go:358] "Generic (PLEG): container finished" podID="74c7ba9b-eb07-4da1-879f-8f4413d87608" containerID="6e3dce62dd0ebc455fe1894f9d3948ff06cffdf20690cac3bc123a006bbdeb8d" exitCode=2 Apr 16 16:12:09.877936 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.877580 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5fd85f4cb6-r9bxs" Apr 16 16:12:09.877936 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.877589 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fd85f4cb6-r9bxs" event={"ID":"74c7ba9b-eb07-4da1-879f-8f4413d87608","Type":"ContainerDied","Data":"6e3dce62dd0ebc455fe1894f9d3948ff06cffdf20690cac3bc123a006bbdeb8d"} Apr 16 16:12:09.877936 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.877625 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5fd85f4cb6-r9bxs" event={"ID":"74c7ba9b-eb07-4da1-879f-8f4413d87608","Type":"ContainerDied","Data":"07aabae5aed6e8af1d1695d708fdeabfff968e57cec5090a44f91b281a547bd4"} Apr 16 16:12:09.877936 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.877647 2576 scope.go:117] "RemoveContainer" containerID="6e3dce62dd0ebc455fe1894f9d3948ff06cffdf20690cac3bc123a006bbdeb8d" Apr 16 16:12:09.885712 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.885697 2576 scope.go:117] "RemoveContainer" containerID="6e3dce62dd0ebc455fe1894f9d3948ff06cffdf20690cac3bc123a006bbdeb8d" Apr 16 16:12:09.885974 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:12:09.885949 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3dce62dd0ebc455fe1894f9d3948ff06cffdf20690cac3bc123a006bbdeb8d\": container with ID starting with 6e3dce62dd0ebc455fe1894f9d3948ff06cffdf20690cac3bc123a006bbdeb8d not found: ID does not exist" containerID="6e3dce62dd0ebc455fe1894f9d3948ff06cffdf20690cac3bc123a006bbdeb8d" Apr 16 16:12:09.886032 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.885975 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3dce62dd0ebc455fe1894f9d3948ff06cffdf20690cac3bc123a006bbdeb8d"} err="failed to get container status \"6e3dce62dd0ebc455fe1894f9d3948ff06cffdf20690cac3bc123a006bbdeb8d\": rpc error: code = 
NotFound desc = could not find container \"6e3dce62dd0ebc455fe1894f9d3948ff06cffdf20690cac3bc123a006bbdeb8d\": container with ID starting with 6e3dce62dd0ebc455fe1894f9d3948ff06cffdf20690cac3bc123a006bbdeb8d not found: ID does not exist" Apr 16 16:12:09.895033 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.895008 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5fd85f4cb6-r9bxs"] Apr 16 16:12:09.896589 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:09.896562 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5fd85f4cb6-r9bxs"] Apr 16 16:12:10.057166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:10.057139 2576 patch_prober.go:28] interesting pod/console-5fd85f4cb6-r9bxs container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.133.0.20:8443/health\": context deadline exceeded" start-of-body= Apr 16 16:12:10.057291 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:10.057187 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-5fd85f4cb6-r9bxs" podUID="74c7ba9b-eb07-4da1-879f-8f4413d87608" containerName="console" probeResult="failure" output="Get \"https://10.133.0.20:8443/health\": context deadline exceeded" Apr 16 16:12:11.277959 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:11.277925 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74c7ba9b-eb07-4da1-879f-8f4413d87608" path="/var/lib/kubelet/pods/74c7ba9b-eb07-4da1-879f-8f4413d87608/volumes" Apr 16 16:12:15.869033 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:15.869005 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-mw9hq" Apr 16 16:12:43.826535 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:43.826505 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-n55dz"] Apr 16 16:12:43.826915 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:12:43.826783 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74c7ba9b-eb07-4da1-879f-8f4413d87608" containerName="console" Apr 16 16:12:43.826915 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:43.826793 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c7ba9b-eb07-4da1-879f-8f4413d87608" containerName="console" Apr 16 16:12:43.826915 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:43.826851 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="74c7ba9b-eb07-4da1-879f-8f4413d87608" containerName="console" Apr 16 16:12:43.830774 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:43.830756 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-n55dz" Apr 16 16:12:43.832997 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:43.832976 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 16 16:12:43.832997 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:43.832986 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 16 16:12:43.833148 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:43.833025 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-jdvqm\"" Apr 16 16:12:43.836054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:43.836032 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-n55dz"] Apr 16 16:12:43.908470 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:43.908429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/07eed151-94e9-46aa-a532-2e2460f02e48-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-n55dz\" (UID: 
\"07eed151-94e9-46aa-a532-2e2460f02e48\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-n55dz" Apr 16 16:12:43.908638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:43.908525 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/07eed151-94e9-46aa-a532-2e2460f02e48-data\") pod \"seaweedfs-tls-serving-7fd5766db9-n55dz\" (UID: \"07eed151-94e9-46aa-a532-2e2460f02e48\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-n55dz" Apr 16 16:12:43.908638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:43.908584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxk6n\" (UniqueName: \"kubernetes.io/projected/07eed151-94e9-46aa-a532-2e2460f02e48-kube-api-access-bxk6n\") pod \"seaweedfs-tls-serving-7fd5766db9-n55dz\" (UID: \"07eed151-94e9-46aa-a532-2e2460f02e48\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-n55dz" Apr 16 16:12:44.009634 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:44.009606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/07eed151-94e9-46aa-a532-2e2460f02e48-data\") pod \"seaweedfs-tls-serving-7fd5766db9-n55dz\" (UID: \"07eed151-94e9-46aa-a532-2e2460f02e48\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-n55dz" Apr 16 16:12:44.009795 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:44.009672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxk6n\" (UniqueName: \"kubernetes.io/projected/07eed151-94e9-46aa-a532-2e2460f02e48-kube-api-access-bxk6n\") pod \"seaweedfs-tls-serving-7fd5766db9-n55dz\" (UID: \"07eed151-94e9-46aa-a532-2e2460f02e48\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-n55dz" Apr 16 16:12:44.009795 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:44.009720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/07eed151-94e9-46aa-a532-2e2460f02e48-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-n55dz\" (UID: \"07eed151-94e9-46aa-a532-2e2460f02e48\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-n55dz" Apr 16 16:12:44.009903 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:12:44.009823 2576 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 16 16:12:44.009903 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:12:44.009837 2576 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-n55dz: secret "seaweedfs-tls-serving" not found Apr 16 16:12:44.009993 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:12:44.009910 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/07eed151-94e9-46aa-a532-2e2460f02e48-seaweedfs-tls-serving podName:07eed151-94e9-46aa-a532-2e2460f02e48 nodeName:}" failed. No retries permitted until 2026-04-16 16:12:44.50988851 +0000 UTC m=+587.864878630 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/07eed151-94e9-46aa-a532-2e2460f02e48-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-n55dz" (UID: "07eed151-94e9-46aa-a532-2e2460f02e48") : secret "seaweedfs-tls-serving" not found Apr 16 16:12:44.009993 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:44.009938 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/07eed151-94e9-46aa-a532-2e2460f02e48-data\") pod \"seaweedfs-tls-serving-7fd5766db9-n55dz\" (UID: \"07eed151-94e9-46aa-a532-2e2460f02e48\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-n55dz" Apr 16 16:12:44.018721 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:44.018698 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxk6n\" (UniqueName: \"kubernetes.io/projected/07eed151-94e9-46aa-a532-2e2460f02e48-kube-api-access-bxk6n\") pod \"seaweedfs-tls-serving-7fd5766db9-n55dz\" (UID: \"07eed151-94e9-46aa-a532-2e2460f02e48\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-n55dz" Apr 16 16:12:44.513758 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:44.513714 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/07eed151-94e9-46aa-a532-2e2460f02e48-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-n55dz\" (UID: \"07eed151-94e9-46aa-a532-2e2460f02e48\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-n55dz" Apr 16 16:12:44.516012 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:44.515988 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/07eed151-94e9-46aa-a532-2e2460f02e48-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-n55dz\" (UID: \"07eed151-94e9-46aa-a532-2e2460f02e48\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-n55dz" 
Apr 16 16:12:44.740373 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:44.740338 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-n55dz" Apr 16 16:12:44.856811 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:44.856780 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-n55dz"] Apr 16 16:12:44.860359 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:12:44.860330 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07eed151_94e9_46aa_a532_2e2460f02e48.slice/crio-433e945bf9f57e36921dedd8a8101125180cbfb7f13d95c9f49147f4c3ea74d0 WatchSource:0}: Error finding container 433e945bf9f57e36921dedd8a8101125180cbfb7f13d95c9f49147f4c3ea74d0: Status 404 returned error can't find the container with id 433e945bf9f57e36921dedd8a8101125180cbfb7f13d95c9f49147f4c3ea74d0 Apr 16 16:12:44.990577 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:44.990543 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-n55dz" event={"ID":"07eed151-94e9-46aa-a532-2e2460f02e48","Type":"ContainerStarted","Data":"433e945bf9f57e36921dedd8a8101125180cbfb7f13d95c9f49147f4c3ea74d0"} Apr 16 16:12:48.002297 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:48.002263 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-n55dz" event={"ID":"07eed151-94e9-46aa-a532-2e2460f02e48","Type":"ContainerStarted","Data":"236d2136c9867698ce35a6702a9c0ed5c16b45949c8e93bce2c4baa213f99a22"} Apr 16 16:12:48.018792 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:12:48.018751 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-n55dz" podStartSLOduration=2.661459583 podStartE2EDuration="5.018739974s" podCreationTimestamp="2026-04-16 16:12:43 +0000 UTC" firstStartedPulling="2026-04-16 
16:12:44.861615932 +0000 UTC m=+588.216606039" lastFinishedPulling="2026-04-16 16:12:47.218896313 +0000 UTC m=+590.573886430" observedRunningTime="2026-04-16 16:12:48.017928602 +0000 UTC m=+591.372918730" watchObservedRunningTime="2026-04-16 16:12:48.018739974 +0000 UTC m=+591.373730103" Apr 16 16:13:04.775700 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:04.775659 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct"] Apr 16 16:13:04.780300 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:04.780277 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" Apr 16 16:13:04.782527 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:04.782508 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wzmqv\"" Apr 16 16:13:04.788374 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:04.788350 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct"] Apr 16 16:13:04.873153 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:04.873117 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30fba28e-9e58-4c6d-9378-3dda92bce2d4-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-8689cc5967-hslct\" (UID: \"30fba28e-9e58-4c6d-9378-3dda92bce2d4\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" Apr 16 16:13:04.973556 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:04.973520 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30fba28e-9e58-4c6d-9378-3dda92bce2d4-kserve-provision-location\") pod 
\"isvc-sklearn-batcher-predictor-8689cc5967-hslct\" (UID: \"30fba28e-9e58-4c6d-9378-3dda92bce2d4\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" Apr 16 16:13:04.973870 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:04.973852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30fba28e-9e58-4c6d-9378-3dda92bce2d4-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-8689cc5967-hslct\" (UID: \"30fba28e-9e58-4c6d-9378-3dda92bce2d4\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" Apr 16 16:13:05.090485 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:05.090409 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" Apr 16 16:13:05.205700 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:05.205677 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct"] Apr 16 16:13:05.208615 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:13:05.208555 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30fba28e_9e58_4c6d_9378_3dda92bce2d4.slice/crio-e9a5e4dea4fba364dc4a5ba95fc5dfe2de2a83f456f70f55d63d74fe8b8e6a0e WatchSource:0}: Error finding container e9a5e4dea4fba364dc4a5ba95fc5dfe2de2a83f456f70f55d63d74fe8b8e6a0e: Status 404 returned error can't find the container with id e9a5e4dea4fba364dc4a5ba95fc5dfe2de2a83f456f70f55d63d74fe8b8e6a0e Apr 16 16:13:06.062976 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:06.062935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" 
event={"ID":"30fba28e-9e58-4c6d-9378-3dda92bce2d4","Type":"ContainerStarted","Data":"e9a5e4dea4fba364dc4a5ba95fc5dfe2de2a83f456f70f55d63d74fe8b8e6a0e"} Apr 16 16:13:08.071317 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:08.071276 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" event={"ID":"30fba28e-9e58-4c6d-9378-3dda92bce2d4","Type":"ContainerStarted","Data":"14db879667cd358a611f1e48093204c74b2ceabe9e451ea1d9eaf050b4b2fc3f"} Apr 16 16:13:12.084202 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:12.084111 2576 generic.go:358] "Generic (PLEG): container finished" podID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerID="14db879667cd358a611f1e48093204c74b2ceabe9e451ea1d9eaf050b4b2fc3f" exitCode=0 Apr 16 16:13:12.084202 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:12.084158 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" event={"ID":"30fba28e-9e58-4c6d-9378-3dda92bce2d4","Type":"ContainerDied","Data":"14db879667cd358a611f1e48093204c74b2ceabe9e451ea1d9eaf050b4b2fc3f"} Apr 16 16:13:26.140038 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:26.139988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" event={"ID":"30fba28e-9e58-4c6d-9378-3dda92bce2d4","Type":"ContainerStarted","Data":"ddc88d4f6d390678e2c134d8308b9f89b2a82500e413abe10f243dc4c928498d"} Apr 16 16:13:29.151118 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:29.151074 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" event={"ID":"30fba28e-9e58-4c6d-9378-3dda92bce2d4","Type":"ContainerStarted","Data":"1ff35d74a9d8d991364b72cde55b14250dd3267c6e9768b64d1207e372ff57ac"} Apr 16 16:13:29.151607 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:29.151361 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" Apr 16 16:13:29.152741 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:29.152710 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 16:13:29.168665 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:29.168619 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podStartSLOduration=1.9707319970000001 podStartE2EDuration="25.168600922s" podCreationTimestamp="2026-04-16 16:13:04 +0000 UTC" firstStartedPulling="2026-04-16 16:13:05.210557023 +0000 UTC m=+608.565547133" lastFinishedPulling="2026-04-16 16:13:28.408425952 +0000 UTC m=+631.763416058" observedRunningTime="2026-04-16 16:13:29.167147284 +0000 UTC m=+632.522137415" watchObservedRunningTime="2026-04-16 16:13:29.168600922 +0000 UTC m=+632.523591105" Apr 16 16:13:30.153906 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:30.153861 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" Apr 16 16:13:30.154329 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:30.153988 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 16:13:30.154898 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:30.154869 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" 
podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:13:31.157287 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:31.157250 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 16:13:31.157718 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:31.157599 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:13:41.158233 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:41.158175 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 16:13:41.158778 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:41.158627 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:13:51.158032 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:51.157985 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 16:13:51.158486 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:13:51.158456 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:14:01.158197 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:01.158142 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 16:14:01.158684 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:01.158641 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:14:11.157399 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:11.157350 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 16:14:11.157810 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:11.157785 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:14:21.157797 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:21.157744 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" 
podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 16:14:21.158322 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:21.158186 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:14:31.158012 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:31.157981 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" Apr 16 16:14:31.158466 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:31.158106 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" Apr 16 16:14:40.034801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:40.034770 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct"] Apr 16 16:14:40.035166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:40.035043 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="kserve-container" containerID="cri-o://ddc88d4f6d390678e2c134d8308b9f89b2a82500e413abe10f243dc4c928498d" gracePeriod=30 Apr 16 16:14:40.035166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:40.035072 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="agent" containerID="cri-o://1ff35d74a9d8d991364b72cde55b14250dd3267c6e9768b64d1207e372ff57ac" gracePeriod=30 Apr 
16 16:14:40.159607 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:40.159577 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5"] Apr 16 16:14:40.163010 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:40.162988 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" Apr 16 16:14:40.173838 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:40.173814 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5"] Apr 16 16:14:40.228740 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:40.228707 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8db39ab-6ed2-403b-84ba-127282e8b1d9-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5\" (UID: \"f8db39ab-6ed2-403b-84ba-127282e8b1d9\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" Apr 16 16:14:40.329601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:40.329534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8db39ab-6ed2-403b-84ba-127282e8b1d9-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5\" (UID: \"f8db39ab-6ed2-403b-84ba-127282e8b1d9\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" Apr 16 16:14:40.329905 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:40.329880 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8db39ab-6ed2-403b-84ba-127282e8b1d9-kserve-provision-location\") pod 
\"isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5\" (UID: \"f8db39ab-6ed2-403b-84ba-127282e8b1d9\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" Apr 16 16:14:40.472605 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:40.472582 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" Apr 16 16:14:40.593690 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:40.593664 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5"] Apr 16 16:14:40.596471 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:14:40.596431 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8db39ab_6ed2_403b_84ba_127282e8b1d9.slice/crio-9252d57970da34ac6cc9285a11e7f24d9d0844152928037279b0e8c91939cf4e WatchSource:0}: Error finding container 9252d57970da34ac6cc9285a11e7f24d9d0844152928037279b0e8c91939cf4e: Status 404 returned error can't find the container with id 9252d57970da34ac6cc9285a11e7f24d9d0844152928037279b0e8c91939cf4e Apr 16 16:14:40.598443 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:40.598422 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:14:41.158001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:41.157963 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 16:14:41.158510 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:41.158485 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" 
podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:14:41.374717 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:41.374679 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" event={"ID":"f8db39ab-6ed2-403b-84ba-127282e8b1d9","Type":"ContainerStarted","Data":"5599be3b9f7cef3068a2825ca8d2758f276e101268efeee8b4c8f5ee9f4253ad"} Apr 16 16:14:41.374717 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:41.374722 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" event={"ID":"f8db39ab-6ed2-403b-84ba-127282e8b1d9","Type":"ContainerStarted","Data":"9252d57970da34ac6cc9285a11e7f24d9d0844152928037279b0e8c91939cf4e"} Apr 16 16:14:44.385852 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:44.385812 2576 generic.go:358] "Generic (PLEG): container finished" podID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerID="ddc88d4f6d390678e2c134d8308b9f89b2a82500e413abe10f243dc4c928498d" exitCode=0 Apr 16 16:14:44.386190 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:44.385886 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" event={"ID":"30fba28e-9e58-4c6d-9378-3dda92bce2d4","Type":"ContainerDied","Data":"ddc88d4f6d390678e2c134d8308b9f89b2a82500e413abe10f243dc4c928498d"} Apr 16 16:14:45.390523 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:45.390490 2576 generic.go:358] "Generic (PLEG): container finished" podID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerID="5599be3b9f7cef3068a2825ca8d2758f276e101268efeee8b4c8f5ee9f4253ad" exitCode=0 Apr 16 16:14:45.390901 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:45.390564 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" event={"ID":"f8db39ab-6ed2-403b-84ba-127282e8b1d9","Type":"ContainerDied","Data":"5599be3b9f7cef3068a2825ca8d2758f276e101268efeee8b4c8f5ee9f4253ad"} Apr 16 16:14:46.395265 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:46.395181 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" event={"ID":"f8db39ab-6ed2-403b-84ba-127282e8b1d9","Type":"ContainerStarted","Data":"bf5eb67f736a1bcbab37d8f722891c6e98998c96f537daea3c6826e66277d43c"} Apr 16 16:14:46.395265 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:46.395232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" event={"ID":"f8db39ab-6ed2-403b-84ba-127282e8b1d9","Type":"ContainerStarted","Data":"6757b52bd3d6c44a822be4fc27adb3db07c8af9b8ddbd1a95a3f59dbbce59dfb"} Apr 16 16:14:46.395660 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:46.395506 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" Apr 16 16:14:46.396833 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:46.396807 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:5000: connect: connection refused" Apr 16 16:14:47.398888 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:47.398862 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" Apr 16 16:14:47.399385 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:47.398988 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:5000: connect: connection refused" Apr 16 16:14:47.400068 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:47.400048 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:14:48.401951 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:48.401912 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:5000: connect: connection refused" Apr 16 16:14:48.402341 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:48.402237 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:14:51.157978 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:51.157939 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 16:14:51.158450 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:51.158267 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:14:58.402034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:58.401988 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:5000: connect: connection refused" Apr 16 16:14:58.402527 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:14:58.402435 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:15:01.157489 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:01.157442 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 16:15:01.157855 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:01.157566 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" Apr 16 16:15:01.157855 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:01.157806 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:15:01.157930 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:01.157920 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" Apr 16 16:15:01.174467 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:01.174419 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podStartSLOduration=21.174406874 podStartE2EDuration="21.174406874s" podCreationTimestamp="2026-04-16 16:14:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:14:46.421512056 +0000 UTC m=+709.776502186" watchObservedRunningTime="2026-04-16 16:15:01.174406874 +0000 UTC m=+724.529397003" Apr 16 16:15:08.402265 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:08.402176 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:5000: connect: connection refused" Apr 16 16:15:08.402705 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:08.402571 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:15:10.172157 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.172132 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" Apr 16 16:15:10.272880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.272848 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30fba28e-9e58-4c6d-9378-3dda92bce2d4-kserve-provision-location\") pod \"30fba28e-9e58-4c6d-9378-3dda92bce2d4\" (UID: \"30fba28e-9e58-4c6d-9378-3dda92bce2d4\") " Apr 16 16:15:10.273165 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.273144 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30fba28e-9e58-4c6d-9378-3dda92bce2d4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "30fba28e-9e58-4c6d-9378-3dda92bce2d4" (UID: "30fba28e-9e58-4c6d-9378-3dda92bce2d4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:15:10.373953 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.373869 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30fba28e-9e58-4c6d-9378-3dda92bce2d4-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:15:10.467741 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.467702 2576 generic.go:358] "Generic (PLEG): container finished" podID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerID="1ff35d74a9d8d991364b72cde55b14250dd3267c6e9768b64d1207e372ff57ac" exitCode=0 Apr 16 16:15:10.467908 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.467759 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" event={"ID":"30fba28e-9e58-4c6d-9378-3dda92bce2d4","Type":"ContainerDied","Data":"1ff35d74a9d8d991364b72cde55b14250dd3267c6e9768b64d1207e372ff57ac"} Apr 16 16:15:10.467908 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:15:10.467792 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" Apr 16 16:15:10.467908 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.467804 2576 scope.go:117] "RemoveContainer" containerID="1ff35d74a9d8d991364b72cde55b14250dd3267c6e9768b64d1207e372ff57ac" Apr 16 16:15:10.468018 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.467792 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct" event={"ID":"30fba28e-9e58-4c6d-9378-3dda92bce2d4","Type":"ContainerDied","Data":"e9a5e4dea4fba364dc4a5ba95fc5dfe2de2a83f456f70f55d63d74fe8b8e6a0e"} Apr 16 16:15:10.475964 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.475941 2576 scope.go:117] "RemoveContainer" containerID="ddc88d4f6d390678e2c134d8308b9f89b2a82500e413abe10f243dc4c928498d" Apr 16 16:15:10.483080 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.483061 2576 scope.go:117] "RemoveContainer" containerID="14db879667cd358a611f1e48093204c74b2ceabe9e451ea1d9eaf050b4b2fc3f" Apr 16 16:15:10.488849 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.488825 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct"] Apr 16 16:15:10.490962 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.490946 2576 scope.go:117] "RemoveContainer" containerID="1ff35d74a9d8d991364b72cde55b14250dd3267c6e9768b64d1207e372ff57ac" Apr 16 16:15:10.491284 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:15:10.491254 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff35d74a9d8d991364b72cde55b14250dd3267c6e9768b64d1207e372ff57ac\": container with ID starting with 1ff35d74a9d8d991364b72cde55b14250dd3267c6e9768b64d1207e372ff57ac not found: ID does not exist" 
containerID="1ff35d74a9d8d991364b72cde55b14250dd3267c6e9768b64d1207e372ff57ac" Apr 16 16:15:10.491386 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.491293 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff35d74a9d8d991364b72cde55b14250dd3267c6e9768b64d1207e372ff57ac"} err="failed to get container status \"1ff35d74a9d8d991364b72cde55b14250dd3267c6e9768b64d1207e372ff57ac\": rpc error: code = NotFound desc = could not find container \"1ff35d74a9d8d991364b72cde55b14250dd3267c6e9768b64d1207e372ff57ac\": container with ID starting with 1ff35d74a9d8d991364b72cde55b14250dd3267c6e9768b64d1207e372ff57ac not found: ID does not exist" Apr 16 16:15:10.491386 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.491312 2576 scope.go:117] "RemoveContainer" containerID="ddc88d4f6d390678e2c134d8308b9f89b2a82500e413abe10f243dc4c928498d" Apr 16 16:15:10.491581 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:15:10.491562 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc88d4f6d390678e2c134d8308b9f89b2a82500e413abe10f243dc4c928498d\": container with ID starting with ddc88d4f6d390678e2c134d8308b9f89b2a82500e413abe10f243dc4c928498d not found: ID does not exist" containerID="ddc88d4f6d390678e2c134d8308b9f89b2a82500e413abe10f243dc4c928498d" Apr 16 16:15:10.491638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.491587 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc88d4f6d390678e2c134d8308b9f89b2a82500e413abe10f243dc4c928498d"} err="failed to get container status \"ddc88d4f6d390678e2c134d8308b9f89b2a82500e413abe10f243dc4c928498d\": rpc error: code = NotFound desc = could not find container \"ddc88d4f6d390678e2c134d8308b9f89b2a82500e413abe10f243dc4c928498d\": container with ID starting with ddc88d4f6d390678e2c134d8308b9f89b2a82500e413abe10f243dc4c928498d not found: ID does not exist" Apr 16 
16:15:10.491638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.491603 2576 scope.go:117] "RemoveContainer" containerID="14db879667cd358a611f1e48093204c74b2ceabe9e451ea1d9eaf050b4b2fc3f" Apr 16 16:15:10.491832 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:15:10.491812 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14db879667cd358a611f1e48093204c74b2ceabe9e451ea1d9eaf050b4b2fc3f\": container with ID starting with 14db879667cd358a611f1e48093204c74b2ceabe9e451ea1d9eaf050b4b2fc3f not found: ID does not exist" containerID="14db879667cd358a611f1e48093204c74b2ceabe9e451ea1d9eaf050b4b2fc3f" Apr 16 16:15:10.491921 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.491832 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14db879667cd358a611f1e48093204c74b2ceabe9e451ea1d9eaf050b4b2fc3f"} err="failed to get container status \"14db879667cd358a611f1e48093204c74b2ceabe9e451ea1d9eaf050b4b2fc3f\": rpc error: code = NotFound desc = could not find container \"14db879667cd358a611f1e48093204c74b2ceabe9e451ea1d9eaf050b4b2fc3f\": container with ID starting with 14db879667cd358a611f1e48093204c74b2ceabe9e451ea1d9eaf050b4b2fc3f not found: ID does not exist" Apr 16 16:15:10.492328 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:10.492310 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-8689cc5967-hslct"] Apr 16 16:15:11.278774 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:11.278743 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" path="/var/lib/kubelet/pods/30fba28e-9e58-4c6d-9378-3dda92bce2d4/volumes" Apr 16 16:15:18.402934 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:18.402888 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" 
podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:5000: connect: connection refused" Apr 16 16:15:18.403367 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:18.403321 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:15:28.402136 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:28.402088 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:5000: connect: connection refused" Apr 16 16:15:28.402609 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:28.402571 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:15:38.402896 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:38.402850 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:5000: connect: connection refused" Apr 16 16:15:38.403387 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:38.403353 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 
503" Apr 16 16:15:48.402640 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:48.402591 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:5000: connect: connection refused" Apr 16 16:15:48.403097 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:48.403014 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:15:57.278780 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:57.278751 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" Apr 16 16:15:57.279094 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:15:57.278797 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" Apr 16 16:16:05.167557 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:05.167523 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5"] Apr 16 16:16:05.168112 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:05.167906 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container" containerID="cri-o://6757b52bd3d6c44a822be4fc27adb3db07c8af9b8ddbd1a95a3f59dbbce59dfb" gracePeriod=30 Apr 16 16:16:05.168112 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:05.167970 2576 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="agent" containerID="cri-o://bf5eb67f736a1bcbab37d8f722891c6e98998c96f537daea3c6826e66277d43c" gracePeriod=30 Apr 16 16:16:07.275751 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:07.275705 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:5000: connect: connection refused" Apr 16 16:16:07.276195 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:07.276040 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:16:09.656170 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:09.656140 2576 generic.go:358] "Generic (PLEG): container finished" podID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerID="6757b52bd3d6c44a822be4fc27adb3db07c8af9b8ddbd1a95a3f59dbbce59dfb" exitCode=0 Apr 16 16:16:09.656540 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:09.656223 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" event={"ID":"f8db39ab-6ed2-403b-84ba-127282e8b1d9","Type":"ContainerDied","Data":"6757b52bd3d6c44a822be4fc27adb3db07c8af9b8ddbd1a95a3f59dbbce59dfb"} Apr 16 16:16:15.268467 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.268430 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6"] Apr 16 16:16:15.268847 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.268820 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="kserve-container" Apr 16 16:16:15.268847 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.268834 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="kserve-container" Apr 16 16:16:15.268847 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.268844 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="agent" Apr 16 16:16:15.268955 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.268849 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="agent" Apr 16 16:16:15.268955 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.268857 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="storage-initializer" Apr 16 16:16:15.268955 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.268863 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="storage-initializer" Apr 16 16:16:15.268955 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.268917 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="kserve-container" Apr 16 16:16:15.268955 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.268924 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="30fba28e-9e58-4c6d-9378-3dda92bce2d4" containerName="agent" Apr 16 16:16:15.271976 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.271957 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" Apr 16 16:16:15.280136 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.280109 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6"] Apr 16 16:16:15.366054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.366016 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f239e257-511b-48b0-b865-328dffe69fa2-kserve-provision-location\") pod \"isvc-logger-predictor-8444b4768-tjnn6\" (UID: \"f239e257-511b-48b0-b865-328dffe69fa2\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" Apr 16 16:16:15.466567 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.466527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f239e257-511b-48b0-b865-328dffe69fa2-kserve-provision-location\") pod \"isvc-logger-predictor-8444b4768-tjnn6\" (UID: \"f239e257-511b-48b0-b865-328dffe69fa2\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" Apr 16 16:16:15.466910 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.466892 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f239e257-511b-48b0-b865-328dffe69fa2-kserve-provision-location\") pod \"isvc-logger-predictor-8444b4768-tjnn6\" (UID: \"f239e257-511b-48b0-b865-328dffe69fa2\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" Apr 16 16:16:15.583950 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.583869 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" Apr 16 16:16:15.703314 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:15.703242 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6"] Apr 16 16:16:15.705640 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:16:15.705601 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf239e257_511b_48b0_b865_328dffe69fa2.slice/crio-4b0bd73238f88442299525c2560a3fb4d0962b70d6fc42f1f3a131be447c8d30 WatchSource:0}: Error finding container 4b0bd73238f88442299525c2560a3fb4d0962b70d6fc42f1f3a131be447c8d30: Status 404 returned error can't find the container with id 4b0bd73238f88442299525c2560a3fb4d0962b70d6fc42f1f3a131be447c8d30 Apr 16 16:16:16.677472 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:16.677438 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" event={"ID":"f239e257-511b-48b0-b865-328dffe69fa2","Type":"ContainerStarted","Data":"75b5f10b752e2e50f21e2965c2c5c77969b2f7a2f03ca5a67ba6d3eb44c6f51b"} Apr 16 16:16:16.677472 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:16.677476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" event={"ID":"f239e257-511b-48b0-b865-328dffe69fa2","Type":"ContainerStarted","Data":"4b0bd73238f88442299525c2560a3fb4d0962b70d6fc42f1f3a131be447c8d30"} Apr 16 16:16:17.275778 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:17.275739 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:5000: connect: connection refused" Apr 16 16:16:17.276062 ip-10-0-137-150 kubenswrapper[2576]: 
I0416 16:16:17.276038 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 16:16:19.687699 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:19.687665 2576 generic.go:358] "Generic (PLEG): container finished" podID="f239e257-511b-48b0-b865-328dffe69fa2" containerID="75b5f10b752e2e50f21e2965c2c5c77969b2f7a2f03ca5a67ba6d3eb44c6f51b" exitCode=0 Apr 16 16:16:19.688076 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:19.687736 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" event={"ID":"f239e257-511b-48b0-b865-328dffe69fa2","Type":"ContainerDied","Data":"75b5f10b752e2e50f21e2965c2c5c77969b2f7a2f03ca5a67ba6d3eb44c6f51b"} Apr 16 16:16:20.693844 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:20.693812 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" event={"ID":"f239e257-511b-48b0-b865-328dffe69fa2","Type":"ContainerStarted","Data":"d905f6905fde40afad048ed397a21406fc9d0e87ccd1841a7bb5ce5d421ba19c"} Apr 16 16:16:20.693844 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:20.693847 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" event={"ID":"f239e257-511b-48b0-b865-328dffe69fa2","Type":"ContainerStarted","Data":"8cc16547cbc6e477b4114990a2c9df931568158282d8653ad0f04cc797e0bbcf"} Apr 16 16:16:20.694348 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:20.694139 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" Apr 16 16:16:20.694348 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:20.694165 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6"
Apr 16 16:16:20.695587 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:20.695559 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 16:16:20.696342 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:20.696310 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:16:20.712002 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:20.711965 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podStartSLOduration=5.711953449 podStartE2EDuration="5.711953449s" podCreationTimestamp="2026-04-16 16:16:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:16:20.710721975 +0000 UTC m=+804.065712099" watchObservedRunningTime="2026-04-16 16:16:20.711953449 +0000 UTC m=+804.066943579"
Apr 16 16:16:21.698293 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:21.698249 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 16:16:21.698766 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:21.698738 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:16:27.275600 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:27.275559 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.31:5000: connect: connection refused"
Apr 16 16:16:27.276062 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:27.275875 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:16:27.277752 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:27.277733 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5"
Apr 16 16:16:27.277813 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:27.277783 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5"
Apr 16 16:16:31.698787 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:31.698739 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 16:16:31.699161 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:31.699116 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:16:35.303873 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.303849 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5"
Apr 16 16:16:35.416104 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.416068 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8db39ab-6ed2-403b-84ba-127282e8b1d9-kserve-provision-location\") pod \"f8db39ab-6ed2-403b-84ba-127282e8b1d9\" (UID: \"f8db39ab-6ed2-403b-84ba-127282e8b1d9\") "
Apr 16 16:16:35.416378 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.416354 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8db39ab-6ed2-403b-84ba-127282e8b1d9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f8db39ab-6ed2-403b-84ba-127282e8b1d9" (UID: "f8db39ab-6ed2-403b-84ba-127282e8b1d9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:16:35.517150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.517071 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8db39ab-6ed2-403b-84ba-127282e8b1d9-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:16:35.743757 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.743728 2576 generic.go:358] "Generic (PLEG): container finished" podID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerID="bf5eb67f736a1bcbab37d8f722891c6e98998c96f537daea3c6826e66277d43c" exitCode=0
Apr 16 16:16:35.743913 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.743794 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" event={"ID":"f8db39ab-6ed2-403b-84ba-127282e8b1d9","Type":"ContainerDied","Data":"bf5eb67f736a1bcbab37d8f722891c6e98998c96f537daea3c6826e66277d43c"}
Apr 16 16:16:35.743913 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.743810 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5"
Apr 16 16:16:35.743913 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.743826 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5" event={"ID":"f8db39ab-6ed2-403b-84ba-127282e8b1d9","Type":"ContainerDied","Data":"9252d57970da34ac6cc9285a11e7f24d9d0844152928037279b0e8c91939cf4e"}
Apr 16 16:16:35.743913 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.743845 2576 scope.go:117] "RemoveContainer" containerID="bf5eb67f736a1bcbab37d8f722891c6e98998c96f537daea3c6826e66277d43c"
Apr 16 16:16:35.754499 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.754482 2576 scope.go:117] "RemoveContainer" containerID="6757b52bd3d6c44a822be4fc27adb3db07c8af9b8ddbd1a95a3f59dbbce59dfb"
Apr 16 16:16:35.761777 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.761760 2576 scope.go:117] "RemoveContainer" containerID="5599be3b9f7cef3068a2825ca8d2758f276e101268efeee8b4c8f5ee9f4253ad"
Apr 16 16:16:35.765664 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.765647 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5"]
Apr 16 16:16:35.769283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.769266 2576 scope.go:117] "RemoveContainer" containerID="bf5eb67f736a1bcbab37d8f722891c6e98998c96f537daea3c6826e66277d43c"
Apr 16 16:16:35.769534 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:16:35.769517 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf5eb67f736a1bcbab37d8f722891c6e98998c96f537daea3c6826e66277d43c\": container with ID starting with bf5eb67f736a1bcbab37d8f722891c6e98998c96f537daea3c6826e66277d43c not found: ID does not exist" containerID="bf5eb67f736a1bcbab37d8f722891c6e98998c96f537daea3c6826e66277d43c"
Apr 16 16:16:35.769590 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.769543 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf5eb67f736a1bcbab37d8f722891c6e98998c96f537daea3c6826e66277d43c"} err="failed to get container status \"bf5eb67f736a1bcbab37d8f722891c6e98998c96f537daea3c6826e66277d43c\": rpc error: code = NotFound desc = could not find container \"bf5eb67f736a1bcbab37d8f722891c6e98998c96f537daea3c6826e66277d43c\": container with ID starting with bf5eb67f736a1bcbab37d8f722891c6e98998c96f537daea3c6826e66277d43c not found: ID does not exist"
Apr 16 16:16:35.769590 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.769562 2576 scope.go:117] "RemoveContainer" containerID="6757b52bd3d6c44a822be4fc27adb3db07c8af9b8ddbd1a95a3f59dbbce59dfb"
Apr 16 16:16:35.769672 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.769591 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-56bb8466fd-7pwn5"]
Apr 16 16:16:35.769778 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:16:35.769761 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6757b52bd3d6c44a822be4fc27adb3db07c8af9b8ddbd1a95a3f59dbbce59dfb\": container with ID starting with 6757b52bd3d6c44a822be4fc27adb3db07c8af9b8ddbd1a95a3f59dbbce59dfb not found: ID does not exist" containerID="6757b52bd3d6c44a822be4fc27adb3db07c8af9b8ddbd1a95a3f59dbbce59dfb"
Apr 16 16:16:35.769819 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.769786 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6757b52bd3d6c44a822be4fc27adb3db07c8af9b8ddbd1a95a3f59dbbce59dfb"} err="failed to get container status \"6757b52bd3d6c44a822be4fc27adb3db07c8af9b8ddbd1a95a3f59dbbce59dfb\": rpc error: code = NotFound desc = could not find container \"6757b52bd3d6c44a822be4fc27adb3db07c8af9b8ddbd1a95a3f59dbbce59dfb\": container with ID starting with 6757b52bd3d6c44a822be4fc27adb3db07c8af9b8ddbd1a95a3f59dbbce59dfb not found: ID does not exist"
Apr 16 16:16:35.769819 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.769804 2576 scope.go:117] "RemoveContainer" containerID="5599be3b9f7cef3068a2825ca8d2758f276e101268efeee8b4c8f5ee9f4253ad"
Apr 16 16:16:35.770049 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:16:35.770033 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5599be3b9f7cef3068a2825ca8d2758f276e101268efeee8b4c8f5ee9f4253ad\": container with ID starting with 5599be3b9f7cef3068a2825ca8d2758f276e101268efeee8b4c8f5ee9f4253ad not found: ID does not exist" containerID="5599be3b9f7cef3068a2825ca8d2758f276e101268efeee8b4c8f5ee9f4253ad"
Apr 16 16:16:35.770095 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:35.770052 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5599be3b9f7cef3068a2825ca8d2758f276e101268efeee8b4c8f5ee9f4253ad"} err="failed to get container status \"5599be3b9f7cef3068a2825ca8d2758f276e101268efeee8b4c8f5ee9f4253ad\": rpc error: code = NotFound desc = could not find container \"5599be3b9f7cef3068a2825ca8d2758f276e101268efeee8b4c8f5ee9f4253ad\": container with ID starting with 5599be3b9f7cef3068a2825ca8d2758f276e101268efeee8b4c8f5ee9f4253ad not found: ID does not exist"
Apr 16 16:16:37.278034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:37.278000 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" path="/var/lib/kubelet/pods/f8db39ab-6ed2-403b-84ba-127282e8b1d9/volumes"
Apr 16 16:16:41.698687 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:41.698628 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 16:16:41.699179 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:41.699158 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:16:51.699121 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:51.699075 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 16:16:51.699562 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:16:51.699477 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:17:01.698955 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:01.698909 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 16:17:01.699473 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:01.699448 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:17:11.698371 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:11.698326 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 16:17:11.698910 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:11.698796 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:17:21.699320 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:21.699273 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 16:17:21.699728 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:21.699695 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:17:31.699038 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:31.699007 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6"
Apr 16 16:17:31.699431 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:31.699193 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6"
Apr 16 16:17:40.481443 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.481407 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6"]
Apr 16 16:17:40.481795 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.481731 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="kserve-container" containerID="cri-o://8cc16547cbc6e477b4114990a2c9df931568158282d8653ad0f04cc797e0bbcf" gracePeriod=30
Apr 16 16:17:40.481906 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.481847 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="agent" containerID="cri-o://d905f6905fde40afad048ed397a21406fc9d0e87ccd1841a7bb5ce5d421ba19c" gracePeriod=30
Apr 16 16:17:40.532823 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.532795 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g"]
Apr 16 16:17:40.533152 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.533140 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container"
Apr 16 16:17:40.533206 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.533154 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container"
Apr 16 16:17:40.533206 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.533173 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="storage-initializer"
Apr 16 16:17:40.533206 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.533179 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="storage-initializer"
Apr 16 16:17:40.533206 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.533186 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="agent"
Apr 16 16:17:40.533206 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.533191 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="agent"
Apr 16 16:17:40.533397 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.533256 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="agent"
Apr 16 16:17:40.533397 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.533264 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8db39ab-6ed2-403b-84ba-127282e8b1d9" containerName="kserve-container"
Apr 16 16:17:40.536356 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.536340 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g"
Apr 16 16:17:40.544078 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.544052 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g"]
Apr 16 16:17:40.587543 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.587504 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c67c8121-e4de-408e-bb71-1fed0775d5e1-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bd79f4555-p796g\" (UID: \"c67c8121-e4de-408e-bb71-1fed0775d5e1\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g"
Apr 16 16:17:40.688687 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.688654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c67c8121-e4de-408e-bb71-1fed0775d5e1-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bd79f4555-p796g\" (UID: \"c67c8121-e4de-408e-bb71-1fed0775d5e1\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g"
Apr 16 16:17:40.689014 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.688997 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c67c8121-e4de-408e-bb71-1fed0775d5e1-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bd79f4555-p796g\" (UID: \"c67c8121-e4de-408e-bb71-1fed0775d5e1\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g"
Apr 16 16:17:40.869321 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.869242 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g"
Apr 16 16:17:40.986850 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:40.986827 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g"]
Apr 16 16:17:40.989522 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:17:40.989487 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc67c8121_e4de_408e_bb71_1fed0775d5e1.slice/crio-978764e5548f7ca5b87e93f4df77aa52c3bffaa52c016ab6305f5b9c5d9675c0 WatchSource:0}: Error finding container 978764e5548f7ca5b87e93f4df77aa52c3bffaa52c016ab6305f5b9c5d9675c0: Status 404 returned error can't find the container with id 978764e5548f7ca5b87e93f4df77aa52c3bffaa52c016ab6305f5b9c5d9675c0
Apr 16 16:17:41.698549 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:41.698512 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 16:17:41.698945 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:41.698828 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:17:41.955076 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:41.954987 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" event={"ID":"c67c8121-e4de-408e-bb71-1fed0775d5e1","Type":"ContainerStarted","Data":"065952c0f5d74c7c164780d04abc62a08e8ca1fd033821b24ee3c58b88d6ea29"}
Apr 16 16:17:41.955076 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:41.955026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" event={"ID":"c67c8121-e4de-408e-bb71-1fed0775d5e1","Type":"ContainerStarted","Data":"978764e5548f7ca5b87e93f4df77aa52c3bffaa52c016ab6305f5b9c5d9675c0"}
Apr 16 16:17:44.968599 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:44.968568 2576 generic.go:358] "Generic (PLEG): container finished" podID="f239e257-511b-48b0-b865-328dffe69fa2" containerID="8cc16547cbc6e477b4114990a2c9df931568158282d8653ad0f04cc797e0bbcf" exitCode=0
Apr 16 16:17:44.968970 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:44.968648 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" event={"ID":"f239e257-511b-48b0-b865-328dffe69fa2","Type":"ContainerDied","Data":"8cc16547cbc6e477b4114990a2c9df931568158282d8653ad0f04cc797e0bbcf"}
Apr 16 16:17:44.969918 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:44.969897 2576 generic.go:358] "Generic (PLEG): container finished" podID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerID="065952c0f5d74c7c164780d04abc62a08e8ca1fd033821b24ee3c58b88d6ea29" exitCode=0
Apr 16 16:17:44.970016 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:44.969970 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" event={"ID":"c67c8121-e4de-408e-bb71-1fed0775d5e1","Type":"ContainerDied","Data":"065952c0f5d74c7c164780d04abc62a08e8ca1fd033821b24ee3c58b88d6ea29"}
Apr 16 16:17:51.698434 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:51.698390 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 16:17:51.698838 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:51.698784 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:17:51.995555 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:51.995471 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" event={"ID":"c67c8121-e4de-408e-bb71-1fed0775d5e1","Type":"ContainerStarted","Data":"60993ff28fc6ab8d40f8f83678f7d3b531335022cc08ce9be6562e610a28252d"}
Apr 16 16:17:51.995766 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:51.995747 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g"
Apr 16 16:17:51.997021 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:51.996995 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:17:52.012354 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:52.012308 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" podStartSLOduration=6.030845709 podStartE2EDuration="12.01229329s" podCreationTimestamp="2026-04-16 16:17:40 +0000 UTC" firstStartedPulling="2026-04-16 16:17:44.971277066 +0000 UTC m=+888.326267172" lastFinishedPulling="2026-04-16 16:17:50.952724642 +0000 UTC m=+894.307714753" observedRunningTime="2026-04-16 16:17:52.010984985 +0000 UTC m=+895.365975128" watchObservedRunningTime="2026-04-16 16:17:52.01229329 +0000 UTC m=+895.367283479"
Apr 16 16:17:52.999591 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:17:52.999554 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:18:01.699309 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:01.699261 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused"
Apr 16 16:18:01.699744 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:01.699398 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6"
Apr 16 16:18:01.699744 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:01.699570 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 16:18:01.699744 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:01.699676 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6"
Apr 16 16:18:03.000017 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:02.999972 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:18:10.623859 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:10.623839 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6"
Apr 16 16:18:10.631613 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:10.631590 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f239e257-511b-48b0-b865-328dffe69fa2-kserve-provision-location\") pod \"f239e257-511b-48b0-b865-328dffe69fa2\" (UID: \"f239e257-511b-48b0-b865-328dffe69fa2\") "
Apr 16 16:18:10.631871 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:10.631850 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f239e257-511b-48b0-b865-328dffe69fa2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f239e257-511b-48b0-b865-328dffe69fa2" (UID: "f239e257-511b-48b0-b865-328dffe69fa2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:18:10.732879 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:10.732843 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f239e257-511b-48b0-b865-328dffe69fa2-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:18:11.057666 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.057632 2576 generic.go:358] "Generic (PLEG): container finished" podID="f239e257-511b-48b0-b865-328dffe69fa2" containerID="d905f6905fde40afad048ed397a21406fc9d0e87ccd1841a7bb5ce5d421ba19c" exitCode=0
Apr 16 16:18:11.057834 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.057686 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" event={"ID":"f239e257-511b-48b0-b865-328dffe69fa2","Type":"ContainerDied","Data":"d905f6905fde40afad048ed397a21406fc9d0e87ccd1841a7bb5ce5d421ba19c"}
Apr 16 16:18:11.057834 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.057709 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6"
Apr 16 16:18:11.057834 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.057723 2576 scope.go:117] "RemoveContainer" containerID="d905f6905fde40afad048ed397a21406fc9d0e87ccd1841a7bb5ce5d421ba19c"
Apr 16 16:18:11.057834 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.057713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6" event={"ID":"f239e257-511b-48b0-b865-328dffe69fa2","Type":"ContainerDied","Data":"4b0bd73238f88442299525c2560a3fb4d0962b70d6fc42f1f3a131be447c8d30"}
Apr 16 16:18:11.065793 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.065772 2576 scope.go:117] "RemoveContainer" containerID="8cc16547cbc6e477b4114990a2c9df931568158282d8653ad0f04cc797e0bbcf"
Apr 16 16:18:11.072673 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.072657 2576 scope.go:117] "RemoveContainer" containerID="75b5f10b752e2e50f21e2965c2c5c77969b2f7a2f03ca5a67ba6d3eb44c6f51b"
Apr 16 16:18:11.079171 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.079154 2576 scope.go:117] "RemoveContainer" containerID="d905f6905fde40afad048ed397a21406fc9d0e87ccd1841a7bb5ce5d421ba19c"
Apr 16 16:18:11.079421 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:18:11.079403 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d905f6905fde40afad048ed397a21406fc9d0e87ccd1841a7bb5ce5d421ba19c\": container with ID starting with d905f6905fde40afad048ed397a21406fc9d0e87ccd1841a7bb5ce5d421ba19c not found: ID does not exist" containerID="d905f6905fde40afad048ed397a21406fc9d0e87ccd1841a7bb5ce5d421ba19c"
Apr 16 16:18:11.079488 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.079429 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d905f6905fde40afad048ed397a21406fc9d0e87ccd1841a7bb5ce5d421ba19c"} err="failed to get container status \"d905f6905fde40afad048ed397a21406fc9d0e87ccd1841a7bb5ce5d421ba19c\": rpc error: code = NotFound desc = could not find container \"d905f6905fde40afad048ed397a21406fc9d0e87ccd1841a7bb5ce5d421ba19c\": container with ID starting with d905f6905fde40afad048ed397a21406fc9d0e87ccd1841a7bb5ce5d421ba19c not found: ID does not exist"
Apr 16 16:18:11.079488 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.079447 2576 scope.go:117] "RemoveContainer" containerID="8cc16547cbc6e477b4114990a2c9df931568158282d8653ad0f04cc797e0bbcf"
Apr 16 16:18:11.079673 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:18:11.079655 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cc16547cbc6e477b4114990a2c9df931568158282d8653ad0f04cc797e0bbcf\": container with ID starting with 8cc16547cbc6e477b4114990a2c9df931568158282d8653ad0f04cc797e0bbcf not found: ID does not exist" containerID="8cc16547cbc6e477b4114990a2c9df931568158282d8653ad0f04cc797e0bbcf"
Apr 16 16:18:11.079751 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.079680 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc16547cbc6e477b4114990a2c9df931568158282d8653ad0f04cc797e0bbcf"} err="failed to get container status \"8cc16547cbc6e477b4114990a2c9df931568158282d8653ad0f04cc797e0bbcf\": rpc error: code = NotFound desc = could not find container \"8cc16547cbc6e477b4114990a2c9df931568158282d8653ad0f04cc797e0bbcf\": container with ID starting with 8cc16547cbc6e477b4114990a2c9df931568158282d8653ad0f04cc797e0bbcf not found: ID does not exist"
Apr 16 16:18:11.079751 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.079696 2576 scope.go:117] "RemoveContainer" containerID="75b5f10b752e2e50f21e2965c2c5c77969b2f7a2f03ca5a67ba6d3eb44c6f51b"
Apr 16 16:18:11.079935 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:18:11.079918 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b5f10b752e2e50f21e2965c2c5c77969b2f7a2f03ca5a67ba6d3eb44c6f51b\": container with ID starting with 75b5f10b752e2e50f21e2965c2c5c77969b2f7a2f03ca5a67ba6d3eb44c6f51b not found: ID does not exist" containerID="75b5f10b752e2e50f21e2965c2c5c77969b2f7a2f03ca5a67ba6d3eb44c6f51b"
Apr 16 16:18:11.079974 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.079941 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b5f10b752e2e50f21e2965c2c5c77969b2f7a2f03ca5a67ba6d3eb44c6f51b"} err="failed to get container status \"75b5f10b752e2e50f21e2965c2c5c77969b2f7a2f03ca5a67ba6d3eb44c6f51b\": rpc error: code = NotFound desc = could not find container \"75b5f10b752e2e50f21e2965c2c5c77969b2f7a2f03ca5a67ba6d3eb44c6f51b\": container with ID starting with 75b5f10b752e2e50f21e2965c2c5c77969b2f7a2f03ca5a67ba6d3eb44c6f51b not found: ID does not exist"
Apr 16 16:18:11.083966 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.083946 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6"]
Apr 16 16:18:11.088456 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.088435 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-8444b4768-tjnn6"]
Apr 16 16:18:11.278735 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:11.278707 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f239e257-511b-48b0-b865-328dffe69fa2" path="/var/lib/kubelet/pods/f239e257-511b-48b0-b865-328dffe69fa2/volumes"
Apr 16 16:18:13.000248 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:13.000195 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:18:22.999494 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:22.999449 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:18:32.999631 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:32.999585 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:18:43.000179 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:43.000137 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:18:53.000322 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:53.000280 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:18:59.274878 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:18:59.274837 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.33:8080: connect: connection refused"
Apr 16 16:19:09.278357 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:09.278327 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g"
Apr 16 16:19:10.635482 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.635450 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g"]
Apr 16 16:19:10.635829 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.635690 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerName="kserve-container" containerID="cri-o://60993ff28fc6ab8d40f8f83678f7d3b531335022cc08ce9be6562e610a28252d" gracePeriod=30
Apr 16 16:19:10.749880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.749845 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs"]
Apr 16 16:19:10.750255 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.750238 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="agent"
Apr 16 16:19:10.750255 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.750256 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="agent"
Apr 16 16:19:10.750414 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.750281 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="kserve-container"
Apr 16 16:19:10.750414 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.750289 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="kserve-container"
Apr 16 16:19:10.750414 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.750317 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f239e257-511b-48b0-b865-328dffe69fa2"
containerName="storage-initializer" Apr 16 16:19:10.750414 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.750326 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="storage-initializer" Apr 16 16:19:10.750414 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.750402 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="kserve-container" Apr 16 16:19:10.750414 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.750416 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f239e257-511b-48b0-b865-328dffe69fa2" containerName="agent" Apr 16 16:19:10.754573 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.754552 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" Apr 16 16:19:10.766397 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.766373 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs"] Apr 16 16:19:10.897900 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.897872 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs\" (UID: \"a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" Apr 16 16:19:10.999203 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.999168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs\" (UID: 
\"a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" Apr 16 16:19:10.999562 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:10.999538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs\" (UID: \"a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" Apr 16 16:19:11.063786 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:11.063759 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" Apr 16 16:19:11.182663 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:11.182585 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs"] Apr 16 16:19:11.185747 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:19:11.185716 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5e1cf3a_f9e9_442d_9a0b_14b37dd7131a.slice/crio-3fd7bfe03e321ee007c32364da4ff4c6e399025522dff7c3012c5dd86c0d0f87 WatchSource:0}: Error finding container 3fd7bfe03e321ee007c32364da4ff4c6e399025522dff7c3012c5dd86c0d0f87: Status 404 returned error can't find the container with id 3fd7bfe03e321ee007c32364da4ff4c6e399025522dff7c3012c5dd86c0d0f87 Apr 16 16:19:11.248561 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:11.248530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" event={"ID":"a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a","Type":"ContainerStarted","Data":"3fd7bfe03e321ee007c32364da4ff4c6e399025522dff7c3012c5dd86c0d0f87"} Apr 16 16:19:12.253643 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:19:12.253612 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" event={"ID":"a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a","Type":"ContainerStarted","Data":"65d3dc3dc1facf28c372d8c5705f3b974f59ba64d440cef9cdc4c17b8f32d8f4"} Apr 16 16:19:14.677537 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:14.677516 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" Apr 16 16:19:14.827638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:14.827573 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c67c8121-e4de-408e-bb71-1fed0775d5e1-kserve-provision-location\") pod \"c67c8121-e4de-408e-bb71-1fed0775d5e1\" (UID: \"c67c8121-e4de-408e-bb71-1fed0775d5e1\") " Apr 16 16:19:14.827918 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:14.827899 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c67c8121-e4de-408e-bb71-1fed0775d5e1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c67c8121-e4de-408e-bb71-1fed0775d5e1" (UID: "c67c8121-e4de-408e-bb71-1fed0775d5e1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:19:14.928940 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:14.928912 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c67c8121-e4de-408e-bb71-1fed0775d5e1-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:19:15.263428 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:15.263399 2576 generic.go:358] "Generic (PLEG): container finished" podID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerID="60993ff28fc6ab8d40f8f83678f7d3b531335022cc08ce9be6562e610a28252d" exitCode=0 Apr 16 16:19:15.263558 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:15.263452 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" event={"ID":"c67c8121-e4de-408e-bb71-1fed0775d5e1","Type":"ContainerDied","Data":"60993ff28fc6ab8d40f8f83678f7d3b531335022cc08ce9be6562e610a28252d"} Apr 16 16:19:15.263558 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:15.263477 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" Apr 16 16:19:15.263558 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:15.263488 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g" event={"ID":"c67c8121-e4de-408e-bb71-1fed0775d5e1","Type":"ContainerDied","Data":"978764e5548f7ca5b87e93f4df77aa52c3bffaa52c016ab6305f5b9c5d9675c0"} Apr 16 16:19:15.263558 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:15.263505 2576 scope.go:117] "RemoveContainer" containerID="60993ff28fc6ab8d40f8f83678f7d3b531335022cc08ce9be6562e610a28252d" Apr 16 16:19:15.264808 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:15.264789 2576 generic.go:358] "Generic (PLEG): container finished" podID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerID="65d3dc3dc1facf28c372d8c5705f3b974f59ba64d440cef9cdc4c17b8f32d8f4" exitCode=0 Apr 16 16:19:15.264880 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:15.264830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" event={"ID":"a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a","Type":"ContainerDied","Data":"65d3dc3dc1facf28c372d8c5705f3b974f59ba64d440cef9cdc4c17b8f32d8f4"} Apr 16 16:19:15.271501 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:15.271479 2576 scope.go:117] "RemoveContainer" containerID="065952c0f5d74c7c164780d04abc62a08e8ca1fd033821b24ee3c58b88d6ea29" Apr 16 16:19:15.278699 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:15.278676 2576 scope.go:117] "RemoveContainer" containerID="60993ff28fc6ab8d40f8f83678f7d3b531335022cc08ce9be6562e610a28252d" Apr 16 16:19:15.278946 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:19:15.278928 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60993ff28fc6ab8d40f8f83678f7d3b531335022cc08ce9be6562e610a28252d\": container with ID starting with 
60993ff28fc6ab8d40f8f83678f7d3b531335022cc08ce9be6562e610a28252d not found: ID does not exist" containerID="60993ff28fc6ab8d40f8f83678f7d3b531335022cc08ce9be6562e610a28252d" Apr 16 16:19:15.278997 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:15.278956 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60993ff28fc6ab8d40f8f83678f7d3b531335022cc08ce9be6562e610a28252d"} err="failed to get container status \"60993ff28fc6ab8d40f8f83678f7d3b531335022cc08ce9be6562e610a28252d\": rpc error: code = NotFound desc = could not find container \"60993ff28fc6ab8d40f8f83678f7d3b531335022cc08ce9be6562e610a28252d\": container with ID starting with 60993ff28fc6ab8d40f8f83678f7d3b531335022cc08ce9be6562e610a28252d not found: ID does not exist" Apr 16 16:19:15.278997 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:15.278973 2576 scope.go:117] "RemoveContainer" containerID="065952c0f5d74c7c164780d04abc62a08e8ca1fd033821b24ee3c58b88d6ea29" Apr 16 16:19:15.279199 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:19:15.279182 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"065952c0f5d74c7c164780d04abc62a08e8ca1fd033821b24ee3c58b88d6ea29\": container with ID starting with 065952c0f5d74c7c164780d04abc62a08e8ca1fd033821b24ee3c58b88d6ea29 not found: ID does not exist" containerID="065952c0f5d74c7c164780d04abc62a08e8ca1fd033821b24ee3c58b88d6ea29" Apr 16 16:19:15.279273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:15.279204 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"065952c0f5d74c7c164780d04abc62a08e8ca1fd033821b24ee3c58b88d6ea29"} err="failed to get container status \"065952c0f5d74c7c164780d04abc62a08e8ca1fd033821b24ee3c58b88d6ea29\": rpc error: code = NotFound desc = could not find container \"065952c0f5d74c7c164780d04abc62a08e8ca1fd033821b24ee3c58b88d6ea29\": container with ID starting with 
065952c0f5d74c7c164780d04abc62a08e8ca1fd033821b24ee3c58b88d6ea29 not found: ID does not exist" Apr 16 16:19:15.294116 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:15.294096 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g"] Apr 16 16:19:15.299911 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:15.299891 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bd79f4555-p796g"] Apr 16 16:19:16.269627 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:16.269538 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" event={"ID":"a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a","Type":"ContainerStarted","Data":"e95fec9030e60d6ee4a99f80dc7cad2aa3994e99788007a28f4feb4343638918"} Apr 16 16:19:16.270071 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:16.269894 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" Apr 16 16:19:16.271155 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:16.271130 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 16:19:16.286303 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:16.286264 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" podStartSLOduration=6.286253338 podStartE2EDuration="6.286253338s" podCreationTimestamp="2026-04-16 16:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:19:16.284521434 +0000 UTC 
m=+979.639511562" watchObservedRunningTime="2026-04-16 16:19:16.286253338 +0000 UTC m=+979.641243527" Apr 16 16:19:17.273336 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:17.273295 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 16:19:17.277961 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:17.277930 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" path="/var/lib/kubelet/pods/c67c8121-e4de-408e-bb71-1fed0775d5e1/volumes" Apr 16 16:19:27.274225 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:27.274169 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 16:19:37.273495 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:37.273449 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 16:19:47.273861 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:47.273816 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 16:19:57.273523 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:19:57.273482 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 16:20:07.273608 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:07.273560 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 16:20:17.274254 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:17.274195 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 16:20:22.274371 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:22.274336 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 16:20:32.276386 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:32.276355 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" Apr 16 16:20:41.097119 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.097086 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs"] Apr 16 16:20:41.097521 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.097380 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="kserve-container" containerID="cri-o://e95fec9030e60d6ee4a99f80dc7cad2aa3994e99788007a28f4feb4343638918" gracePeriod=30 Apr 16 16:20:41.177451 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.177419 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg"] Apr 16 16:20:41.177741 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.177730 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerName="kserve-container" Apr 16 16:20:41.177785 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.177743 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerName="kserve-container" Apr 16 16:20:41.177785 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.177764 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerName="storage-initializer" Apr 16 16:20:41.177785 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.177770 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerName="storage-initializer" Apr 16 16:20:41.177879 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.177822 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c67c8121-e4de-408e-bb71-1fed0775d5e1" containerName="kserve-container" Apr 16 16:20:41.180903 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.180882 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" Apr 16 16:20:41.189687 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.189663 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg"] Apr 16 16:20:41.281799 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.281771 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2bbce9b-7153-4424-88cb-a8890b480273-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg\" (UID: \"b2bbce9b-7153-4424-88cb-a8890b480273\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" Apr 16 16:20:41.383151 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.383073 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2bbce9b-7153-4424-88cb-a8890b480273-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg\" (UID: \"b2bbce9b-7153-4424-88cb-a8890b480273\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" Apr 16 16:20:41.383488 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.383468 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2bbce9b-7153-4424-88cb-a8890b480273-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg\" (UID: \"b2bbce9b-7153-4424-88cb-a8890b480273\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" Apr 16 16:20:41.491287 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.491236 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" Apr 16 16:20:41.607521 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.607396 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg"] Apr 16 16:20:41.610325 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:20:41.610294 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2bbce9b_7153_4424_88cb_a8890b480273.slice/crio-0b77c9b6bca92ef40bc9d2c73eb8e67e04f35bd13e7f12366c1bc5824d989b9b WatchSource:0}: Error finding container 0b77c9b6bca92ef40bc9d2c73eb8e67e04f35bd13e7f12366c1bc5824d989b9b: Status 404 returned error can't find the container with id 0b77c9b6bca92ef40bc9d2c73eb8e67e04f35bd13e7f12366c1bc5824d989b9b Apr 16 16:20:41.612611 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:41.612590 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:20:42.274974 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:42.274937 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 16:20:42.527110 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:42.527018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" event={"ID":"b2bbce9b-7153-4424-88cb-a8890b480273","Type":"ContainerStarted","Data":"e63023ee94b73d7eac8320b75974146c979952c20480f4e29054ba891883c295"} Apr 16 16:20:42.527110 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:42.527055 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" event={"ID":"b2bbce9b-7153-4424-88cb-a8890b480273","Type":"ContainerStarted","Data":"0b77c9b6bca92ef40bc9d2c73eb8e67e04f35bd13e7f12366c1bc5824d989b9b"} Apr 16 16:20:45.444769 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.444746 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" Apr 16 16:20:45.517656 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.517629 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a-kserve-provision-location\") pod \"a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a\" (UID: \"a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a\") " Apr 16 16:20:45.517936 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.517913 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" (UID: "a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:20:45.539542 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.539516 2576 generic.go:358] "Generic (PLEG): container finished" podID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerID="e95fec9030e60d6ee4a99f80dc7cad2aa3994e99788007a28f4feb4343638918" exitCode=0 Apr 16 16:20:45.539655 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.539582 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" Apr 16 16:20:45.539655 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.539587 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" event={"ID":"a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a","Type":"ContainerDied","Data":"e95fec9030e60d6ee4a99f80dc7cad2aa3994e99788007a28f4feb4343638918"} Apr 16 16:20:45.539655 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.539619 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs" event={"ID":"a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a","Type":"ContainerDied","Data":"3fd7bfe03e321ee007c32364da4ff4c6e399025522dff7c3012c5dd86c0d0f87"} Apr 16 16:20:45.539655 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.539635 2576 scope.go:117] "RemoveContainer" containerID="e95fec9030e60d6ee4a99f80dc7cad2aa3994e99788007a28f4feb4343638918" Apr 16 16:20:45.540956 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.540930 2576 generic.go:358] "Generic (PLEG): container finished" podID="b2bbce9b-7153-4424-88cb-a8890b480273" containerID="e63023ee94b73d7eac8320b75974146c979952c20480f4e29054ba891883c295" exitCode=0 Apr 16 16:20:45.541045 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.540972 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" event={"ID":"b2bbce9b-7153-4424-88cb-a8890b480273","Type":"ContainerDied","Data":"e63023ee94b73d7eac8320b75974146c979952c20480f4e29054ba891883c295"} Apr 16 16:20:45.547691 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.547672 2576 scope.go:117] "RemoveContainer" containerID="65d3dc3dc1facf28c372d8c5705f3b974f59ba64d440cef9cdc4c17b8f32d8f4" Apr 16 16:20:45.554598 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.554584 2576 scope.go:117] "RemoveContainer" 
containerID="e95fec9030e60d6ee4a99f80dc7cad2aa3994e99788007a28f4feb4343638918" Apr 16 16:20:45.554836 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:20:45.554818 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e95fec9030e60d6ee4a99f80dc7cad2aa3994e99788007a28f4feb4343638918\": container with ID starting with e95fec9030e60d6ee4a99f80dc7cad2aa3994e99788007a28f4feb4343638918 not found: ID does not exist" containerID="e95fec9030e60d6ee4a99f80dc7cad2aa3994e99788007a28f4feb4343638918" Apr 16 16:20:45.554883 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.554851 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e95fec9030e60d6ee4a99f80dc7cad2aa3994e99788007a28f4feb4343638918"} err="failed to get container status \"e95fec9030e60d6ee4a99f80dc7cad2aa3994e99788007a28f4feb4343638918\": rpc error: code = NotFound desc = could not find container \"e95fec9030e60d6ee4a99f80dc7cad2aa3994e99788007a28f4feb4343638918\": container with ID starting with e95fec9030e60d6ee4a99f80dc7cad2aa3994e99788007a28f4feb4343638918 not found: ID does not exist" Apr 16 16:20:45.554883 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.554872 2576 scope.go:117] "RemoveContainer" containerID="65d3dc3dc1facf28c372d8c5705f3b974f59ba64d440cef9cdc4c17b8f32d8f4" Apr 16 16:20:45.555075 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:20:45.555060 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65d3dc3dc1facf28c372d8c5705f3b974f59ba64d440cef9cdc4c17b8f32d8f4\": container with ID starting with 65d3dc3dc1facf28c372d8c5705f3b974f59ba64d440cef9cdc4c17b8f32d8f4 not found: ID does not exist" containerID="65d3dc3dc1facf28c372d8c5705f3b974f59ba64d440cef9cdc4c17b8f32d8f4" Apr 16 16:20:45.555118 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.555079 2576 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"65d3dc3dc1facf28c372d8c5705f3b974f59ba64d440cef9cdc4c17b8f32d8f4"} err="failed to get container status \"65d3dc3dc1facf28c372d8c5705f3b974f59ba64d440cef9cdc4c17b8f32d8f4\": rpc error: code = NotFound desc = could not find container \"65d3dc3dc1facf28c372d8c5705f3b974f59ba64d440cef9cdc4c17b8f32d8f4\": container with ID starting with 65d3dc3dc1facf28c372d8c5705f3b974f59ba64d440cef9cdc4c17b8f32d8f4 not found: ID does not exist" Apr 16 16:20:45.570413 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.570393 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs"] Apr 16 16:20:45.574003 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.573982 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-7fd76f4446-hlccs"] Apr 16 16:20:45.618091 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:45.618070 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:20:47.283679 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:20:47.283641 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" path="/var/lib/kubelet/pods/a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a/volumes" Apr 16 16:23:00.017251 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:00.017201 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" event={"ID":"b2bbce9b-7153-4424-88cb-a8890b480273","Type":"ContainerStarted","Data":"6d1ec70036401754be6715c0f3399508d28beb28f21aad30a3e495ee0e2c64b4"} Apr 16 16:23:00.017689 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:00.017343 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" Apr 16 16:23:00.047648 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:00.047603 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" podStartSLOduration=4.957388952 podStartE2EDuration="2m19.047590752s" podCreationTimestamp="2026-04-16 16:20:41 +0000 UTC" firstStartedPulling="2026-04-16 16:20:45.542050669 +0000 UTC m=+1068.897040780" lastFinishedPulling="2026-04-16 16:22:59.63225247 +0000 UTC m=+1202.987242580" observedRunningTime="2026-04-16 16:23:00.046233155 +0000 UTC m=+1203.401223281" watchObservedRunningTime="2026-04-16 16:23:00.047590752 +0000 UTC m=+1203.402580880" Apr 16 16:23:31.026014 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.025980 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" Apr 16 16:23:31.369689 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.369618 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg"] Apr 16 16:23:31.369870 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.369849 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" podUID="b2bbce9b-7153-4424-88cb-a8890b480273" containerName="kserve-container" containerID="cri-o://6d1ec70036401754be6715c0f3399508d28beb28f21aad30a3e495ee0e2c64b4" gracePeriod=30 Apr 16 16:23:31.466413 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.466378 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt"] Apr 16 16:23:31.466702 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.466686 2576 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="storage-initializer" Apr 16 16:23:31.466702 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.466701 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="storage-initializer" Apr 16 16:23:31.466840 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.466709 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="kserve-container" Apr 16 16:23:31.466840 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.466715 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="kserve-container" Apr 16 16:23:31.466840 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.466776 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a5e1cf3a-f9e9-442d-9a0b-14b37dd7131a" containerName="kserve-container" Apr 16 16:23:31.489693 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.489658 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt"] Apr 16 16:23:31.489859 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.489780 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" Apr 16 16:23:31.602545 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.602509 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13a92dfd-4167-4713-b202-01b18a6724ed-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt\" (UID: \"13a92dfd-4167-4713-b202-01b18a6724ed\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" Apr 16 16:23:31.702936 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.702900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13a92dfd-4167-4713-b202-01b18a6724ed-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt\" (UID: \"13a92dfd-4167-4713-b202-01b18a6724ed\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" Apr 16 16:23:31.703287 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.703268 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13a92dfd-4167-4713-b202-01b18a6724ed-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt\" (UID: \"13a92dfd-4167-4713-b202-01b18a6724ed\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" Apr 16 16:23:31.799524 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.799500 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" Apr 16 16:23:31.920695 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:31.920607 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt"] Apr 16 16:23:31.923500 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:23:31.923466 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13a92dfd_4167_4713_b202_01b18a6724ed.slice/crio-ec90d740df0a768444cf64deb7f5e0bdc1b795d85990d614cb15e276722a4e44 WatchSource:0}: Error finding container ec90d740df0a768444cf64deb7f5e0bdc1b795d85990d614cb15e276722a4e44: Status 404 returned error can't find the container with id ec90d740df0a768444cf64deb7f5e0bdc1b795d85990d614cb15e276722a4e44 Apr 16 16:23:32.118490 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:32.118448 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" event={"ID":"13a92dfd-4167-4713-b202-01b18a6724ed","Type":"ContainerStarted","Data":"90b2dfcf334eb7cedc7a286c3a85971014d488aa90f18ed7ac4b386f6357d38e"} Apr 16 16:23:32.118490 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:32.118495 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" event={"ID":"13a92dfd-4167-4713-b202-01b18a6724ed","Type":"ContainerStarted","Data":"ec90d740df0a768444cf64deb7f5e0bdc1b795d85990d614cb15e276722a4e44"} Apr 16 16:23:32.425623 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:32.425602 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" Apr 16 16:23:32.611314 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:32.611233 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2bbce9b-7153-4424-88cb-a8890b480273-kserve-provision-location\") pod \"b2bbce9b-7153-4424-88cb-a8890b480273\" (UID: \"b2bbce9b-7153-4424-88cb-a8890b480273\") " Apr 16 16:23:32.611536 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:32.611508 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2bbce9b-7153-4424-88cb-a8890b480273-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b2bbce9b-7153-4424-88cb-a8890b480273" (UID: "b2bbce9b-7153-4424-88cb-a8890b480273"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:23:32.712599 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:32.712570 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2bbce9b-7153-4424-88cb-a8890b480273-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:23:33.122574 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:33.122540 2576 generic.go:358] "Generic (PLEG): container finished" podID="b2bbce9b-7153-4424-88cb-a8890b480273" containerID="6d1ec70036401754be6715c0f3399508d28beb28f21aad30a3e495ee0e2c64b4" exitCode=0 Apr 16 16:23:33.122958 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:33.122582 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" event={"ID":"b2bbce9b-7153-4424-88cb-a8890b480273","Type":"ContainerDied","Data":"6d1ec70036401754be6715c0f3399508d28beb28f21aad30a3e495ee0e2c64b4"} Apr 16 16:23:33.122958 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:33.122609 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" Apr 16 16:23:33.122958 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:33.122617 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg" event={"ID":"b2bbce9b-7153-4424-88cb-a8890b480273","Type":"ContainerDied","Data":"0b77c9b6bca92ef40bc9d2c73eb8e67e04f35bd13e7f12366c1bc5824d989b9b"} Apr 16 16:23:33.122958 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:33.122632 2576 scope.go:117] "RemoveContainer" containerID="6d1ec70036401754be6715c0f3399508d28beb28f21aad30a3e495ee0e2c64b4" Apr 16 16:23:33.137166 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:33.137144 2576 scope.go:117] "RemoveContainer" containerID="e63023ee94b73d7eac8320b75974146c979952c20480f4e29054ba891883c295" Apr 16 16:23:33.144790 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:33.144768 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg"] Apr 16 16:23:33.144948 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:33.144927 2576 scope.go:117] "RemoveContainer" containerID="6d1ec70036401754be6715c0f3399508d28beb28f21aad30a3e495ee0e2c64b4" Apr 16 16:23:33.145192 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:23:33.145172 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d1ec70036401754be6715c0f3399508d28beb28f21aad30a3e495ee0e2c64b4\": container with ID starting with 6d1ec70036401754be6715c0f3399508d28beb28f21aad30a3e495ee0e2c64b4 not found: ID does not exist" containerID="6d1ec70036401754be6715c0f3399508d28beb28f21aad30a3e495ee0e2c64b4" Apr 16 16:23:33.145283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:33.145203 2576 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d1ec70036401754be6715c0f3399508d28beb28f21aad30a3e495ee0e2c64b4"} err="failed to get container status \"6d1ec70036401754be6715c0f3399508d28beb28f21aad30a3e495ee0e2c64b4\": rpc error: code = NotFound desc = could not find container \"6d1ec70036401754be6715c0f3399508d28beb28f21aad30a3e495ee0e2c64b4\": container with ID starting with 6d1ec70036401754be6715c0f3399508d28beb28f21aad30a3e495ee0e2c64b4 not found: ID does not exist" Apr 16 16:23:33.145283 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:33.145243 2576 scope.go:117] "RemoveContainer" containerID="e63023ee94b73d7eac8320b75974146c979952c20480f4e29054ba891883c295" Apr 16 16:23:33.145528 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:23:33.145511 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e63023ee94b73d7eac8320b75974146c979952c20480f4e29054ba891883c295\": container with ID starting with e63023ee94b73d7eac8320b75974146c979952c20480f4e29054ba891883c295 not found: ID does not exist" containerID="e63023ee94b73d7eac8320b75974146c979952c20480f4e29054ba891883c295" Apr 16 16:23:33.145570 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:33.145537 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e63023ee94b73d7eac8320b75974146c979952c20480f4e29054ba891883c295"} err="failed to get container status \"e63023ee94b73d7eac8320b75974146c979952c20480f4e29054ba891883c295\": rpc error: code = NotFound desc = could not find container \"e63023ee94b73d7eac8320b75974146c979952c20480f4e29054ba891883c295\": container with ID starting with e63023ee94b73d7eac8320b75974146c979952c20480f4e29054ba891883c295 not found: ID does not exist" Apr 16 16:23:33.149254 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:33.149234 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-788c698548-7djwg"] Apr 16 16:23:33.278630 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:33.278598 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2bbce9b-7153-4424-88cb-a8890b480273" path="/var/lib/kubelet/pods/b2bbce9b-7153-4424-88cb-a8890b480273/volumes" Apr 16 16:23:36.134637 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:36.134605 2576 generic.go:358] "Generic (PLEG): container finished" podID="13a92dfd-4167-4713-b202-01b18a6724ed" containerID="90b2dfcf334eb7cedc7a286c3a85971014d488aa90f18ed7ac4b386f6357d38e" exitCode=0 Apr 16 16:23:36.134972 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:36.134677 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" event={"ID":"13a92dfd-4167-4713-b202-01b18a6724ed","Type":"ContainerDied","Data":"90b2dfcf334eb7cedc7a286c3a85971014d488aa90f18ed7ac4b386f6357d38e"} Apr 16 16:23:37.139689 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:37.139654 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" event={"ID":"13a92dfd-4167-4713-b202-01b18a6724ed","Type":"ContainerStarted","Data":"99642c5d379f0f307f44baa4cefcad883662d62a4a487cdf7bed1642c25af5d9"} Apr 16 16:23:37.140126 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:37.139936 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" Apr 16 16:23:37.141525 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:37.141498 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" podUID="13a92dfd-4167-4713-b202-01b18a6724ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 16 16:23:37.158820 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:37.158773 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" podStartSLOduration=6.1587593609999995 podStartE2EDuration="6.158759361s" podCreationTimestamp="2026-04-16 16:23:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:23:37.157261207 +0000 UTC m=+1240.512251336" watchObservedRunningTime="2026-04-16 16:23:37.158759361 +0000 UTC m=+1240.513749489" Apr 16 16:23:38.143499 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:38.143456 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" podUID="13a92dfd-4167-4713-b202-01b18a6724ed" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 16 16:23:48.145082 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:48.145008 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" Apr 16 16:23:51.478432 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:51.478399 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt"] Apr 16 16:23:51.478810 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:51.478621 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" podUID="13a92dfd-4167-4713-b202-01b18a6724ed" containerName="kserve-container" containerID="cri-o://99642c5d379f0f307f44baa4cefcad883662d62a4a487cdf7bed1642c25af5d9" gracePeriod=30 Apr 16 16:23:51.584652 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:51.584618 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj"] Apr 16 16:23:51.584949 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:51.584935 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2bbce9b-7153-4424-88cb-a8890b480273" containerName="storage-initializer" Apr 16 16:23:51.584996 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:51.584951 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2bbce9b-7153-4424-88cb-a8890b480273" containerName="storage-initializer" Apr 16 16:23:51.584996 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:51.584960 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2bbce9b-7153-4424-88cb-a8890b480273" containerName="kserve-container" Apr 16 16:23:51.584996 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:51.584966 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2bbce9b-7153-4424-88cb-a8890b480273" containerName="kserve-container" Apr 16 16:23:51.585091 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:51.585021 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2bbce9b-7153-4424-88cb-a8890b480273" containerName="kserve-container" Apr 16 16:23:51.587796 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:51.587780 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" Apr 16 16:23:51.595515 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:51.595489 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj"] Apr 16 16:23:51.647461 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:51.647433 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76d22f44-ecbe-4cbe-87be-c3cf8a26b5db-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj\" (UID: \"76d22f44-ecbe-4cbe-87be-c3cf8a26b5db\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" Apr 16 16:23:51.748065 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:51.747982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76d22f44-ecbe-4cbe-87be-c3cf8a26b5db-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj\" (UID: \"76d22f44-ecbe-4cbe-87be-c3cf8a26b5db\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" Apr 16 16:23:51.748371 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:51.748351 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76d22f44-ecbe-4cbe-87be-c3cf8a26b5db-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj\" (UID: \"76d22f44-ecbe-4cbe-87be-c3cf8a26b5db\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" Apr 16 16:23:51.898785 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:51.898750 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" Apr 16 16:23:52.019311 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.019282 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj"] Apr 16 16:23:52.021188 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:23:52.021159 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76d22f44_ecbe_4cbe_87be_c3cf8a26b5db.slice/crio-d7f071aa7ee6092fda9834d3d160d2741c47a7a5f50b3d648ae0be69716e63ff WatchSource:0}: Error finding container d7f071aa7ee6092fda9834d3d160d2741c47a7a5f50b3d648ae0be69716e63ff: Status 404 returned error can't find the container with id d7f071aa7ee6092fda9834d3d160d2741c47a7a5f50b3d648ae0be69716e63ff Apr 16 16:23:52.104961 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.104942 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" Apr 16 16:23:52.151790 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.151762 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13a92dfd-4167-4713-b202-01b18a6724ed-kserve-provision-location\") pod \"13a92dfd-4167-4713-b202-01b18a6724ed\" (UID: \"13a92dfd-4167-4713-b202-01b18a6724ed\") " Apr 16 16:23:52.152199 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.152170 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a92dfd-4167-4713-b202-01b18a6724ed-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "13a92dfd-4167-4713-b202-01b18a6724ed" (UID: "13a92dfd-4167-4713-b202-01b18a6724ed"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:23:52.189837 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.189806 2576 generic.go:358] "Generic (PLEG): container finished" podID="13a92dfd-4167-4713-b202-01b18a6724ed" containerID="99642c5d379f0f307f44baa4cefcad883662d62a4a487cdf7bed1642c25af5d9" exitCode=0 Apr 16 16:23:52.189971 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.189896 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" Apr 16 16:23:52.190035 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.189893 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" event={"ID":"13a92dfd-4167-4713-b202-01b18a6724ed","Type":"ContainerDied","Data":"99642c5d379f0f307f44baa4cefcad883662d62a4a487cdf7bed1642c25af5d9"} Apr 16 16:23:52.190035 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.190019 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt" event={"ID":"13a92dfd-4167-4713-b202-01b18a6724ed","Type":"ContainerDied","Data":"ec90d740df0a768444cf64deb7f5e0bdc1b795d85990d614cb15e276722a4e44"} Apr 16 16:23:52.190131 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.190049 2576 scope.go:117] "RemoveContainer" containerID="99642c5d379f0f307f44baa4cefcad883662d62a4a487cdf7bed1642c25af5d9" Apr 16 16:23:52.191344 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.191318 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" event={"ID":"76d22f44-ecbe-4cbe-87be-c3cf8a26b5db","Type":"ContainerStarted","Data":"7b92a1aeeff49b59455a6c842b840fa21354256b09484eaee571e8d8924d6291"} Apr 16 16:23:52.191437 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.191353 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" event={"ID":"76d22f44-ecbe-4cbe-87be-c3cf8a26b5db","Type":"ContainerStarted","Data":"d7f071aa7ee6092fda9834d3d160d2741c47a7a5f50b3d648ae0be69716e63ff"} Apr 16 16:23:52.198898 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.198884 2576 scope.go:117] "RemoveContainer" containerID="90b2dfcf334eb7cedc7a286c3a85971014d488aa90f18ed7ac4b386f6357d38e" Apr 16 16:23:52.207161 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.207137 2576 scope.go:117] "RemoveContainer" containerID="99642c5d379f0f307f44baa4cefcad883662d62a4a487cdf7bed1642c25af5d9" Apr 16 16:23:52.207481 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:23:52.207460 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99642c5d379f0f307f44baa4cefcad883662d62a4a487cdf7bed1642c25af5d9\": container with ID starting with 99642c5d379f0f307f44baa4cefcad883662d62a4a487cdf7bed1642c25af5d9 not found: ID does not exist" containerID="99642c5d379f0f307f44baa4cefcad883662d62a4a487cdf7bed1642c25af5d9" Apr 16 16:23:52.207554 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.207489 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99642c5d379f0f307f44baa4cefcad883662d62a4a487cdf7bed1642c25af5d9"} err="failed to get container status \"99642c5d379f0f307f44baa4cefcad883662d62a4a487cdf7bed1642c25af5d9\": rpc error: code = NotFound desc = could not find container \"99642c5d379f0f307f44baa4cefcad883662d62a4a487cdf7bed1642c25af5d9\": container with ID starting with 99642c5d379f0f307f44baa4cefcad883662d62a4a487cdf7bed1642c25af5d9 not found: ID does not exist" Apr 16 16:23:52.207554 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.207508 2576 scope.go:117] "RemoveContainer" containerID="90b2dfcf334eb7cedc7a286c3a85971014d488aa90f18ed7ac4b386f6357d38e" Apr 16 16:23:52.207732 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:23:52.207716 
2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90b2dfcf334eb7cedc7a286c3a85971014d488aa90f18ed7ac4b386f6357d38e\": container with ID starting with 90b2dfcf334eb7cedc7a286c3a85971014d488aa90f18ed7ac4b386f6357d38e not found: ID does not exist" containerID="90b2dfcf334eb7cedc7a286c3a85971014d488aa90f18ed7ac4b386f6357d38e" Apr 16 16:23:52.207774 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.207738 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90b2dfcf334eb7cedc7a286c3a85971014d488aa90f18ed7ac4b386f6357d38e"} err="failed to get container status \"90b2dfcf334eb7cedc7a286c3a85971014d488aa90f18ed7ac4b386f6357d38e\": rpc error: code = NotFound desc = could not find container \"90b2dfcf334eb7cedc7a286c3a85971014d488aa90f18ed7ac4b386f6357d38e\": container with ID starting with 90b2dfcf334eb7cedc7a286c3a85971014d488aa90f18ed7ac4b386f6357d38e not found: ID does not exist" Apr 16 16:23:52.222473 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.222437 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt"] Apr 16 16:23:52.223855 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.223831 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-57f9c55f54-dqcxt"] Apr 16 16:23:52.253299 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:52.253277 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13a92dfd-4167-4713-b202-01b18a6724ed-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:23:53.278054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:53.277983 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a92dfd-4167-4713-b202-01b18a6724ed" 
path="/var/lib/kubelet/pods/13a92dfd-4167-4713-b202-01b18a6724ed/volumes" Apr 16 16:23:57.206784 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:57.206748 2576 generic.go:358] "Generic (PLEG): container finished" podID="76d22f44-ecbe-4cbe-87be-c3cf8a26b5db" containerID="7b92a1aeeff49b59455a6c842b840fa21354256b09484eaee571e8d8924d6291" exitCode=0 Apr 16 16:23:57.207160 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:57.206795 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" event={"ID":"76d22f44-ecbe-4cbe-87be-c3cf8a26b5db","Type":"ContainerDied","Data":"7b92a1aeeff49b59455a6c842b840fa21354256b09484eaee571e8d8924d6291"} Apr 16 16:23:58.211661 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:58.211631 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" event={"ID":"76d22f44-ecbe-4cbe-87be-c3cf8a26b5db","Type":"ContainerStarted","Data":"b278dc6626f8076327ad5941a611d680896252d266285a34611022d8f2ce4174"} Apr 16 16:23:58.212025 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:58.211955 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" Apr 16 16:23:58.227630 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:23:58.227588 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" podStartSLOduration=7.227576194 podStartE2EDuration="7.227576194s" podCreationTimestamp="2026-04-16 16:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:23:58.226714426 +0000 UTC m=+1261.581704556" watchObservedRunningTime="2026-04-16 16:23:58.227576194 +0000 UTC m=+1261.582566324" Apr 16 16:24:29.220312 ip-10-0-137-150 kubenswrapper[2576]: I0416 
16:24:29.220275 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" Apr 16 16:24:31.681932 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:31.681895 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj"] Apr 16 16:24:31.682336 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:31.682169 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" podUID="76d22f44-ecbe-4cbe-87be-c3cf8a26b5db" containerName="kserve-container" containerID="cri-o://b278dc6626f8076327ad5941a611d680896252d266285a34611022d8f2ce4174" gracePeriod=30 Apr 16 16:24:31.750963 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:31.750925 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g"] Apr 16 16:24:31.751554 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:31.751528 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13a92dfd-4167-4713-b202-01b18a6724ed" containerName="storage-initializer" Apr 16 16:24:31.751554 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:31.751553 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a92dfd-4167-4713-b202-01b18a6724ed" containerName="storage-initializer" Apr 16 16:24:31.751719 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:31.751565 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13a92dfd-4167-4713-b202-01b18a6724ed" containerName="kserve-container" Apr 16 16:24:31.751719 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:31.751574 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a92dfd-4167-4713-b202-01b18a6724ed" containerName="kserve-container" Apr 16 16:24:31.751719 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:31.751662 2576 
memory_manager.go:356] "RemoveStaleState removing state" podUID="13a92dfd-4167-4713-b202-01b18a6724ed" containerName="kserve-container" Apr 16 16:24:31.755411 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:31.755392 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" Apr 16 16:24:31.767001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:31.766975 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g"] Apr 16 16:24:31.836775 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:31.836741 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87ef0162-5856-4c65-87f0-4c0fa0e67a3b-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g\" (UID: \"87ef0162-5856-4c65-87f0-4c0fa0e67a3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" Apr 16 16:24:31.938287 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:31.938169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87ef0162-5856-4c65-87f0-4c0fa0e67a3b-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g\" (UID: \"87ef0162-5856-4c65-87f0-4c0fa0e67a3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" Apr 16 16:24:31.938552 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:31.938531 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87ef0162-5856-4c65-87f0-4c0fa0e67a3b-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g\" (UID: \"87ef0162-5856-4c65-87f0-4c0fa0e67a3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" Apr 16 
16:24:32.066176 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:32.066138 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" Apr 16 16:24:32.184472 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:32.184428 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g"] Apr 16 16:24:32.187483 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:24:32.187452 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87ef0162_5856_4c65_87f0_4c0fa0e67a3b.slice/crio-21a299795581a0a4de6521e0d2485ed522e7437097dc332a1d9460420e17dcb5 WatchSource:0}: Error finding container 21a299795581a0a4de6521e0d2485ed522e7437097dc332a1d9460420e17dcb5: Status 404 returned error can't find the container with id 21a299795581a0a4de6521e0d2485ed522e7437097dc332a1d9460420e17dcb5 Apr 16 16:24:32.313123 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:32.313087 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" event={"ID":"87ef0162-5856-4c65-87f0-4c0fa0e67a3b","Type":"ContainerStarted","Data":"5c1c2e9f96b26fb3cfb08f621e737e73db9b34b1b5f709f7ccf2848a7a676456"} Apr 16 16:24:32.313123 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:32.313127 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" event={"ID":"87ef0162-5856-4c65-87f0-4c0fa0e67a3b","Type":"ContainerStarted","Data":"21a299795581a0a4de6521e0d2485ed522e7437097dc332a1d9460420e17dcb5"} Apr 16 16:24:32.809934 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:32.809903 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" Apr 16 16:24:32.946522 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:32.946499 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76d22f44-ecbe-4cbe-87be-c3cf8a26b5db-kserve-provision-location\") pod \"76d22f44-ecbe-4cbe-87be-c3cf8a26b5db\" (UID: \"76d22f44-ecbe-4cbe-87be-c3cf8a26b5db\") " Apr 16 16:24:32.946818 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:32.946795 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d22f44-ecbe-4cbe-87be-c3cf8a26b5db-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "76d22f44-ecbe-4cbe-87be-c3cf8a26b5db" (UID: "76d22f44-ecbe-4cbe-87be-c3cf8a26b5db"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:24:33.047554 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:33.047524 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/76d22f44-ecbe-4cbe-87be-c3cf8a26b5db-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:24:33.316815 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:33.316782 2576 generic.go:358] "Generic (PLEG): container finished" podID="76d22f44-ecbe-4cbe-87be-c3cf8a26b5db" containerID="b278dc6626f8076327ad5941a611d680896252d266285a34611022d8f2ce4174" exitCode=0 Apr 16 16:24:33.316937 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:33.316827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" event={"ID":"76d22f44-ecbe-4cbe-87be-c3cf8a26b5db","Type":"ContainerDied","Data":"b278dc6626f8076327ad5941a611d680896252d266285a34611022d8f2ce4174"} Apr 16 16:24:33.316937 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:24:33.316842 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" Apr 16 16:24:33.316937 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:33.316859 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj" event={"ID":"76d22f44-ecbe-4cbe-87be-c3cf8a26b5db","Type":"ContainerDied","Data":"d7f071aa7ee6092fda9834d3d160d2741c47a7a5f50b3d648ae0be69716e63ff"} Apr 16 16:24:33.316937 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:33.316879 2576 scope.go:117] "RemoveContainer" containerID="b278dc6626f8076327ad5941a611d680896252d266285a34611022d8f2ce4174" Apr 16 16:24:33.324360 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:33.324343 2576 scope.go:117] "RemoveContainer" containerID="7b92a1aeeff49b59455a6c842b840fa21354256b09484eaee571e8d8924d6291" Apr 16 16:24:33.331198 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:33.331180 2576 scope.go:117] "RemoveContainer" containerID="b278dc6626f8076327ad5941a611d680896252d266285a34611022d8f2ce4174" Apr 16 16:24:33.331484 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:24:33.331467 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b278dc6626f8076327ad5941a611d680896252d266285a34611022d8f2ce4174\": container with ID starting with b278dc6626f8076327ad5941a611d680896252d266285a34611022d8f2ce4174 not found: ID does not exist" containerID="b278dc6626f8076327ad5941a611d680896252d266285a34611022d8f2ce4174" Apr 16 16:24:33.331540 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:33.331492 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b278dc6626f8076327ad5941a611d680896252d266285a34611022d8f2ce4174"} err="failed to get container status \"b278dc6626f8076327ad5941a611d680896252d266285a34611022d8f2ce4174\": rpc 
error: code = NotFound desc = could not find container \"b278dc6626f8076327ad5941a611d680896252d266285a34611022d8f2ce4174\": container with ID starting with b278dc6626f8076327ad5941a611d680896252d266285a34611022d8f2ce4174 not found: ID does not exist" Apr 16 16:24:33.331540 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:33.331510 2576 scope.go:117] "RemoveContainer" containerID="7b92a1aeeff49b59455a6c842b840fa21354256b09484eaee571e8d8924d6291" Apr 16 16:24:33.331773 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:24:33.331752 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b92a1aeeff49b59455a6c842b840fa21354256b09484eaee571e8d8924d6291\": container with ID starting with 7b92a1aeeff49b59455a6c842b840fa21354256b09484eaee571e8d8924d6291 not found: ID does not exist" containerID="7b92a1aeeff49b59455a6c842b840fa21354256b09484eaee571e8d8924d6291" Apr 16 16:24:33.331815 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:33.331783 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b92a1aeeff49b59455a6c842b840fa21354256b09484eaee571e8d8924d6291"} err="failed to get container status \"7b92a1aeeff49b59455a6c842b840fa21354256b09484eaee571e8d8924d6291\": rpc error: code = NotFound desc = could not find container \"7b92a1aeeff49b59455a6c842b840fa21354256b09484eaee571e8d8924d6291\": container with ID starting with 7b92a1aeeff49b59455a6c842b840fa21354256b09484eaee571e8d8924d6291 not found: ID does not exist" Apr 16 16:24:33.332235 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:33.332203 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj"] Apr 16 16:24:33.335079 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:33.335060 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-65b68fcf68-z8jdj"] Apr 16 16:24:35.277690 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:35.277656 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d22f44-ecbe-4cbe-87be-c3cf8a26b5db" path="/var/lib/kubelet/pods/76d22f44-ecbe-4cbe-87be-c3cf8a26b5db/volumes" Apr 16 16:24:36.326869 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:36.326792 2576 generic.go:358] "Generic (PLEG): container finished" podID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" containerID="5c1c2e9f96b26fb3cfb08f621e737e73db9b34b1b5f709f7ccf2848a7a676456" exitCode=0 Apr 16 16:24:36.326869 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:36.326841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" event={"ID":"87ef0162-5856-4c65-87f0-4c0fa0e67a3b","Type":"ContainerDied","Data":"5c1c2e9f96b26fb3cfb08f621e737e73db9b34b1b5f709f7ccf2848a7a676456"} Apr 16 16:24:37.333875 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:37.333840 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" event={"ID":"87ef0162-5856-4c65-87f0-4c0fa0e67a3b","Type":"ContainerStarted","Data":"ac2a9967e701138dbff1976a15d76d1b2d777a4936b3c1adfc7a7746ecf30e65"} Apr 16 16:24:39.343925 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:39.343887 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" event={"ID":"87ef0162-5856-4c65-87f0-4c0fa0e67a3b","Type":"ContainerStarted","Data":"764cb640b3b4e685e6b519b4dfa80393ee3b39574d39f81342dcd9367fb90d08"} Apr 16 16:24:39.344376 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:39.344002 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" Apr 16 16:24:39.362153 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:39.362107 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" podStartSLOduration=5.49384537 podStartE2EDuration="8.36209274s" podCreationTimestamp="2026-04-16 16:24:31 +0000 UTC" firstStartedPulling="2026-04-16 16:24:36.391753458 +0000 UTC m=+1299.746743564" lastFinishedPulling="2026-04-16 16:24:39.260000826 +0000 UTC m=+1302.614990934" observedRunningTime="2026-04-16 16:24:39.360588382 +0000 UTC m=+1302.715578512" watchObservedRunningTime="2026-04-16 16:24:39.36209274 +0000 UTC m=+1302.717082868" Apr 16 16:24:40.347360 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:24:40.347331 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" Apr 16 16:25:11.353549 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:11.353514 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" Apr 16 16:25:41.355009 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:41.354932 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" Apr 16 16:25:41.836187 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:41.836157 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g"] Apr 16 16:25:41.836471 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:41.836435 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" podUID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" containerName="kserve-container" containerID="cri-o://ac2a9967e701138dbff1976a15d76d1b2d777a4936b3c1adfc7a7746ecf30e65" gracePeriod=30 Apr 16 16:25:41.836562 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:41.836476 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" podUID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" containerName="kserve-agent" containerID="cri-o://764cb640b3b4e685e6b519b4dfa80393ee3b39574d39f81342dcd9367fb90d08" gracePeriod=30 Apr 16 16:25:41.889775 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:41.889738 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9"] Apr 16 16:25:41.890064 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:41.890051 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76d22f44-ecbe-4cbe-87be-c3cf8a26b5db" containerName="kserve-container" Apr 16 16:25:41.890115 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:41.890066 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d22f44-ecbe-4cbe-87be-c3cf8a26b5db" containerName="kserve-container" Apr 16 16:25:41.890115 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:41.890081 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76d22f44-ecbe-4cbe-87be-c3cf8a26b5db" containerName="storage-initializer" Apr 16 16:25:41.890115 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:41.890087 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d22f44-ecbe-4cbe-87be-c3cf8a26b5db" containerName="storage-initializer" Apr 16 16:25:41.890205 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:41.890139 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="76d22f44-ecbe-4cbe-87be-c3cf8a26b5db" containerName="kserve-container" Apr 16 16:25:41.893462 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:41.893443 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" Apr 16 16:25:41.904515 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:41.904494 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9"] Apr 16 16:25:41.981178 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:41.981151 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a642ceb0-7db9-4be3-8d65-25de0b4635b1-kserve-provision-location\") pod \"isvc-paddle-predictor-c57db76c5-zxxx9\" (UID: \"a642ceb0-7db9-4be3-8d65-25de0b4635b1\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" Apr 16 16:25:42.081701 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:42.081672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a642ceb0-7db9-4be3-8d65-25de0b4635b1-kserve-provision-location\") pod \"isvc-paddle-predictor-c57db76c5-zxxx9\" (UID: \"a642ceb0-7db9-4be3-8d65-25de0b4635b1\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" Apr 16 16:25:42.082035 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:42.082016 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a642ceb0-7db9-4be3-8d65-25de0b4635b1-kserve-provision-location\") pod \"isvc-paddle-predictor-c57db76c5-zxxx9\" (UID: \"a642ceb0-7db9-4be3-8d65-25de0b4635b1\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" Apr 16 16:25:42.203154 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:42.203116 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" Apr 16 16:25:42.328349 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:42.328289 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9"] Apr 16 16:25:42.332631 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:25:42.332592 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda642ceb0_7db9_4be3_8d65_25de0b4635b1.slice/crio-703ca97bb7a7d3ee57071b148769540ecd4a0b529eabec97e2175c0341d4e246 WatchSource:0}: Error finding container 703ca97bb7a7d3ee57071b148769540ecd4a0b529eabec97e2175c0341d4e246: Status 404 returned error can't find the container with id 703ca97bb7a7d3ee57071b148769540ecd4a0b529eabec97e2175c0341d4e246 Apr 16 16:25:42.335072 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:42.335054 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:25:42.554091 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:42.554007 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" event={"ID":"a642ceb0-7db9-4be3-8d65-25de0b4635b1","Type":"ContainerStarted","Data":"7d6d0cab4c50071fead44215af4c2c87e8c5f70b8f7d6342d23a4cd6d4b1ae29"} Apr 16 16:25:42.554091 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:42.554047 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" event={"ID":"a642ceb0-7db9-4be3-8d65-25de0b4635b1","Type":"ContainerStarted","Data":"703ca97bb7a7d3ee57071b148769540ecd4a0b529eabec97e2175c0341d4e246"} Apr 16 16:25:44.562446 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:44.562408 2576 generic.go:358] "Generic (PLEG): container finished" podID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" containerID="ac2a9967e701138dbff1976a15d76d1b2d777a4936b3c1adfc7a7746ecf30e65" 
exitCode=0 Apr 16 16:25:44.562807 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:44.562480 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" event={"ID":"87ef0162-5856-4c65-87f0-4c0fa0e67a3b","Type":"ContainerDied","Data":"ac2a9967e701138dbff1976a15d76d1b2d777a4936b3c1adfc7a7746ecf30e65"} Apr 16 16:25:47.573580 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:47.573548 2576 generic.go:358] "Generic (PLEG): container finished" podID="a642ceb0-7db9-4be3-8d65-25de0b4635b1" containerID="7d6d0cab4c50071fead44215af4c2c87e8c5f70b8f7d6342d23a4cd6d4b1ae29" exitCode=0 Apr 16 16:25:47.573969 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:47.573620 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" event={"ID":"a642ceb0-7db9-4be3-8d65-25de0b4635b1","Type":"ContainerDied","Data":"7d6d0cab4c50071fead44215af4c2c87e8c5f70b8f7d6342d23a4cd6d4b1ae29"} Apr 16 16:25:51.351418 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:51.351376 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" podUID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.38:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 16:25:59.616644 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:59.616604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" event={"ID":"a642ceb0-7db9-4be3-8d65-25de0b4635b1","Type":"ContainerStarted","Data":"e08bfc6db9da2545ea2902b73b7f16f2f16ae2eb2d808235fbb8b493396b630f"} Apr 16 16:25:59.617108 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:59.616946 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" 
Apr 16 16:25:59.618179 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:59.618152 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" podUID="a642ceb0-7db9-4be3-8d65-25de0b4635b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 16:25:59.636169 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:25:59.636125 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" podStartSLOduration=6.757132714 podStartE2EDuration="18.636112411s" podCreationTimestamp="2026-04-16 16:25:41 +0000 UTC" firstStartedPulling="2026-04-16 16:25:47.57482217 +0000 UTC m=+1370.929812281" lastFinishedPulling="2026-04-16 16:25:59.45380186 +0000 UTC m=+1382.808791978" observedRunningTime="2026-04-16 16:25:59.634465343 +0000 UTC m=+1382.989455472" watchObservedRunningTime="2026-04-16 16:25:59.636112411 +0000 UTC m=+1382.991102604" Apr 16 16:26:00.620112 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:00.620076 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" podUID="a642ceb0-7db9-4be3-8d65-25de0b4635b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 16:26:01.350650 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:01.350609 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" podUID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.38:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 16:26:10.620303 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:10.620261 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" podUID="a642ceb0-7db9-4be3-8d65-25de0b4635b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 16:26:11.351453 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:11.351414 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" podUID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.38:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 16:26:11.351621 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:11.351527 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" Apr 16 16:26:11.975064 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:11.975040 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" Apr 16 16:26:12.022195 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.022163 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87ef0162-5856-4c65-87f0-4c0fa0e67a3b-kserve-provision-location\") pod \"87ef0162-5856-4c65-87f0-4c0fa0e67a3b\" (UID: \"87ef0162-5856-4c65-87f0-4c0fa0e67a3b\") " Apr 16 16:26:12.022452 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.022428 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87ef0162-5856-4c65-87f0-4c0fa0e67a3b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "87ef0162-5856-4c65-87f0-4c0fa0e67a3b" (UID: "87ef0162-5856-4c65-87f0-4c0fa0e67a3b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:26:12.123099 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.123015 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87ef0162-5856-4c65-87f0-4c0fa0e67a3b-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:26:12.660407 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.660374 2576 generic.go:358] "Generic (PLEG): container finished" podID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" containerID="764cb640b3b4e685e6b519b4dfa80393ee3b39574d39f81342dcd9367fb90d08" exitCode=0 Apr 16 16:26:12.660584 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.660455 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" Apr 16 16:26:12.660584 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.660462 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" event={"ID":"87ef0162-5856-4c65-87f0-4c0fa0e67a3b","Type":"ContainerDied","Data":"764cb640b3b4e685e6b519b4dfa80393ee3b39574d39f81342dcd9367fb90d08"} Apr 16 16:26:12.660584 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.660512 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g" event={"ID":"87ef0162-5856-4c65-87f0-4c0fa0e67a3b","Type":"ContainerDied","Data":"21a299795581a0a4de6521e0d2485ed522e7437097dc332a1d9460420e17dcb5"} Apr 16 16:26:12.660584 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.660530 2576 scope.go:117] "RemoveContainer" containerID="764cb640b3b4e685e6b519b4dfa80393ee3b39574d39f81342dcd9367fb90d08" Apr 16 16:26:12.671924 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.671900 2576 scope.go:117] "RemoveContainer" 
containerID="ac2a9967e701138dbff1976a15d76d1b2d777a4936b3c1adfc7a7746ecf30e65" Apr 16 16:26:12.678980 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.678964 2576 scope.go:117] "RemoveContainer" containerID="5c1c2e9f96b26fb3cfb08f621e737e73db9b34b1b5f709f7ccf2848a7a676456" Apr 16 16:26:12.682794 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.682775 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g"] Apr 16 16:26:12.685533 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.685509 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-64dbc547c5-6qt9g"] Apr 16 16:26:12.686770 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.686751 2576 scope.go:117] "RemoveContainer" containerID="764cb640b3b4e685e6b519b4dfa80393ee3b39574d39f81342dcd9367fb90d08" Apr 16 16:26:12.687005 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:26:12.686989 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"764cb640b3b4e685e6b519b4dfa80393ee3b39574d39f81342dcd9367fb90d08\": container with ID starting with 764cb640b3b4e685e6b519b4dfa80393ee3b39574d39f81342dcd9367fb90d08 not found: ID does not exist" containerID="764cb640b3b4e685e6b519b4dfa80393ee3b39574d39f81342dcd9367fb90d08" Apr 16 16:26:12.687052 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.687014 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"764cb640b3b4e685e6b519b4dfa80393ee3b39574d39f81342dcd9367fb90d08"} err="failed to get container status \"764cb640b3b4e685e6b519b4dfa80393ee3b39574d39f81342dcd9367fb90d08\": rpc error: code = NotFound desc = could not find container \"764cb640b3b4e685e6b519b4dfa80393ee3b39574d39f81342dcd9367fb90d08\": container with ID starting with 764cb640b3b4e685e6b519b4dfa80393ee3b39574d39f81342dcd9367fb90d08 not found: ID does not exist" Apr 16 
16:26:12.687052 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.687032 2576 scope.go:117] "RemoveContainer" containerID="ac2a9967e701138dbff1976a15d76d1b2d777a4936b3c1adfc7a7746ecf30e65" Apr 16 16:26:12.687270 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:26:12.687255 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac2a9967e701138dbff1976a15d76d1b2d777a4936b3c1adfc7a7746ecf30e65\": container with ID starting with ac2a9967e701138dbff1976a15d76d1b2d777a4936b3c1adfc7a7746ecf30e65 not found: ID does not exist" containerID="ac2a9967e701138dbff1976a15d76d1b2d777a4936b3c1adfc7a7746ecf30e65" Apr 16 16:26:12.687328 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.687273 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac2a9967e701138dbff1976a15d76d1b2d777a4936b3c1adfc7a7746ecf30e65"} err="failed to get container status \"ac2a9967e701138dbff1976a15d76d1b2d777a4936b3c1adfc7a7746ecf30e65\": rpc error: code = NotFound desc = could not find container \"ac2a9967e701138dbff1976a15d76d1b2d777a4936b3c1adfc7a7746ecf30e65\": container with ID starting with ac2a9967e701138dbff1976a15d76d1b2d777a4936b3c1adfc7a7746ecf30e65 not found: ID does not exist" Apr 16 16:26:12.687328 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.687287 2576 scope.go:117] "RemoveContainer" containerID="5c1c2e9f96b26fb3cfb08f621e737e73db9b34b1b5f709f7ccf2848a7a676456" Apr 16 16:26:12.687511 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:26:12.687494 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c1c2e9f96b26fb3cfb08f621e737e73db9b34b1b5f709f7ccf2848a7a676456\": container with ID starting with 5c1c2e9f96b26fb3cfb08f621e737e73db9b34b1b5f709f7ccf2848a7a676456 not found: ID does not exist" containerID="5c1c2e9f96b26fb3cfb08f621e737e73db9b34b1b5f709f7ccf2848a7a676456" Apr 16 16:26:12.687553 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:12.687516 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c1c2e9f96b26fb3cfb08f621e737e73db9b34b1b5f709f7ccf2848a7a676456"} err="failed to get container status \"5c1c2e9f96b26fb3cfb08f621e737e73db9b34b1b5f709f7ccf2848a7a676456\": rpc error: code = NotFound desc = could not find container \"5c1c2e9f96b26fb3cfb08f621e737e73db9b34b1b5f709f7ccf2848a7a676456\": container with ID starting with 5c1c2e9f96b26fb3cfb08f621e737e73db9b34b1b5f709f7ccf2848a7a676456 not found: ID does not exist" Apr 16 16:26:13.277745 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:13.277715 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" path="/var/lib/kubelet/pods/87ef0162-5856-4c65-87f0-4c0fa0e67a3b/volumes" Apr 16 16:26:20.620415 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:20.620372 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" podUID="a642ceb0-7db9-4be3-8d65-25de0b4635b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 16:26:30.620481 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:30.620434 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" podUID="a642ceb0-7db9-4be3-8d65-25de0b4635b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 16:26:40.621387 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:40.621356 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" Apr 16 16:26:43.473302 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.473269 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9"] Apr 16 16:26:43.473706 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.473523 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" podUID="a642ceb0-7db9-4be3-8d65-25de0b4635b1" containerName="kserve-container" containerID="cri-o://e08bfc6db9da2545ea2902b73b7f16f2f16ae2eb2d808235fbb8b493396b630f" gracePeriod=30 Apr 16 16:26:43.553443 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.553410 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc"] Apr 16 16:26:43.553738 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.553726 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" containerName="kserve-container" Apr 16 16:26:43.553782 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.553741 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" containerName="kserve-container" Apr 16 16:26:43.553782 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.553755 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" containerName="kserve-agent" Apr 16 16:26:43.553782 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.553760 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" containerName="kserve-agent" Apr 16 16:26:43.553782 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.553770 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" containerName="storage-initializer" Apr 16 16:26:43.553782 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.553779 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" 
containerName="storage-initializer" Apr 16 16:26:43.553963 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.553823 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" containerName="kserve-agent" Apr 16 16:26:43.553963 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.553836 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="87ef0162-5856-4c65-87f0-4c0fa0e67a3b" containerName="kserve-container" Apr 16 16:26:43.572780 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.572757 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc"] Apr 16 16:26:43.572912 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.572862 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" Apr 16 16:26:43.664545 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.664516 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f9b721b-551f-40d8-a78f-d07bfd3eae71-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-8c7997d9b-dzhcc\" (UID: \"4f9b721b-551f-40d8-a78f-d07bfd3eae71\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" Apr 16 16:26:43.765160 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.765080 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f9b721b-551f-40d8-a78f-d07bfd3eae71-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-8c7997d9b-dzhcc\" (UID: \"4f9b721b-551f-40d8-a78f-d07bfd3eae71\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" Apr 16 16:26:43.765470 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.765450 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f9b721b-551f-40d8-a78f-d07bfd3eae71-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-8c7997d9b-dzhcc\" (UID: \"4f9b721b-551f-40d8-a78f-d07bfd3eae71\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" Apr 16 16:26:43.883960 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:43.883915 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" Apr 16 16:26:44.004820 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:44.004795 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc"] Apr 16 16:26:44.007196 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:26:44.007165 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f9b721b_551f_40d8_a78f_d07bfd3eae71.slice/crio-931c036a9e055a9ca2e3ef965414851ce5b07166343ee45263835b5f4f0889a1 WatchSource:0}: Error finding container 931c036a9e055a9ca2e3ef965414851ce5b07166343ee45263835b5f4f0889a1: Status 404 returned error can't find the container with id 931c036a9e055a9ca2e3ef965414851ce5b07166343ee45263835b5f4f0889a1 Apr 16 16:26:44.769540 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:44.769507 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" event={"ID":"4f9b721b-551f-40d8-a78f-d07bfd3eae71","Type":"ContainerStarted","Data":"3e4c9afbe0ffda777b75920f56acd948fa0931d14d7f5628fd25899989f70b70"} Apr 16 16:26:44.769540 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:44.769542 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" 
event={"ID":"4f9b721b-551f-40d8-a78f-d07bfd3eae71","Type":"ContainerStarted","Data":"931c036a9e055a9ca2e3ef965414851ce5b07166343ee45263835b5f4f0889a1"} Apr 16 16:26:46.016929 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.016905 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" Apr 16 16:26:46.084608 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.084580 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a642ceb0-7db9-4be3-8d65-25de0b4635b1-kserve-provision-location\") pod \"a642ceb0-7db9-4be3-8d65-25de0b4635b1\" (UID: \"a642ceb0-7db9-4be3-8d65-25de0b4635b1\") " Apr 16 16:26:46.094250 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.094182 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a642ceb0-7db9-4be3-8d65-25de0b4635b1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a642ceb0-7db9-4be3-8d65-25de0b4635b1" (UID: "a642ceb0-7db9-4be3-8d65-25de0b4635b1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:26:46.186022 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.185987 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a642ceb0-7db9-4be3-8d65-25de0b4635b1-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:26:46.776161 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.776132 2576 generic.go:358] "Generic (PLEG): container finished" podID="a642ceb0-7db9-4be3-8d65-25de0b4635b1" containerID="e08bfc6db9da2545ea2902b73b7f16f2f16ae2eb2d808235fbb8b493396b630f" exitCode=0 Apr 16 16:26:46.776355 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.776201 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" Apr 16 16:26:46.776355 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.776203 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" event={"ID":"a642ceb0-7db9-4be3-8d65-25de0b4635b1","Type":"ContainerDied","Data":"e08bfc6db9da2545ea2902b73b7f16f2f16ae2eb2d808235fbb8b493396b630f"} Apr 16 16:26:46.776355 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.776317 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9" event={"ID":"a642ceb0-7db9-4be3-8d65-25de0b4635b1","Type":"ContainerDied","Data":"703ca97bb7a7d3ee57071b148769540ecd4a0b529eabec97e2175c0341d4e246"} Apr 16 16:26:46.776355 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.776333 2576 scope.go:117] "RemoveContainer" containerID="e08bfc6db9da2545ea2902b73b7f16f2f16ae2eb2d808235fbb8b493396b630f" Apr 16 16:26:46.784637 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.784621 2576 scope.go:117] "RemoveContainer" containerID="7d6d0cab4c50071fead44215af4c2c87e8c5f70b8f7d6342d23a4cd6d4b1ae29" Apr 16 
16:26:46.791259 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.791244 2576 scope.go:117] "RemoveContainer" containerID="e08bfc6db9da2545ea2902b73b7f16f2f16ae2eb2d808235fbb8b493396b630f" Apr 16 16:26:46.791490 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:26:46.791473 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e08bfc6db9da2545ea2902b73b7f16f2f16ae2eb2d808235fbb8b493396b630f\": container with ID starting with e08bfc6db9da2545ea2902b73b7f16f2f16ae2eb2d808235fbb8b493396b630f not found: ID does not exist" containerID="e08bfc6db9da2545ea2902b73b7f16f2f16ae2eb2d808235fbb8b493396b630f" Apr 16 16:26:46.791554 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.791499 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e08bfc6db9da2545ea2902b73b7f16f2f16ae2eb2d808235fbb8b493396b630f"} err="failed to get container status \"e08bfc6db9da2545ea2902b73b7f16f2f16ae2eb2d808235fbb8b493396b630f\": rpc error: code = NotFound desc = could not find container \"e08bfc6db9da2545ea2902b73b7f16f2f16ae2eb2d808235fbb8b493396b630f\": container with ID starting with e08bfc6db9da2545ea2902b73b7f16f2f16ae2eb2d808235fbb8b493396b630f not found: ID does not exist" Apr 16 16:26:46.791554 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.791515 2576 scope.go:117] "RemoveContainer" containerID="7d6d0cab4c50071fead44215af4c2c87e8c5f70b8f7d6342d23a4cd6d4b1ae29" Apr 16 16:26:46.791717 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:26:46.791702 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d6d0cab4c50071fead44215af4c2c87e8c5f70b8f7d6342d23a4cd6d4b1ae29\": container with ID starting with 7d6d0cab4c50071fead44215af4c2c87e8c5f70b8f7d6342d23a4cd6d4b1ae29 not found: ID does not exist" containerID="7d6d0cab4c50071fead44215af4c2c87e8c5f70b8f7d6342d23a4cd6d4b1ae29" Apr 16 16:26:46.791755 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.791720 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d6d0cab4c50071fead44215af4c2c87e8c5f70b8f7d6342d23a4cd6d4b1ae29"} err="failed to get container status \"7d6d0cab4c50071fead44215af4c2c87e8c5f70b8f7d6342d23a4cd6d4b1ae29\": rpc error: code = NotFound desc = could not find container \"7d6d0cab4c50071fead44215af4c2c87e8c5f70b8f7d6342d23a4cd6d4b1ae29\": container with ID starting with 7d6d0cab4c50071fead44215af4c2c87e8c5f70b8f7d6342d23a4cd6d4b1ae29 not found: ID does not exist" Apr 16 16:26:46.798827 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.798807 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9"] Apr 16 16:26:46.808278 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:46.806579 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-c57db76c5-zxxx9"] Apr 16 16:26:47.278195 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:47.278164 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a642ceb0-7db9-4be3-8d65-25de0b4635b1" path="/var/lib/kubelet/pods/a642ceb0-7db9-4be3-8d65-25de0b4635b1/volumes" Apr 16 16:26:49.789097 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:49.789064 2576 generic.go:358] "Generic (PLEG): container finished" podID="4f9b721b-551f-40d8-a78f-d07bfd3eae71" containerID="3e4c9afbe0ffda777b75920f56acd948fa0931d14d7f5628fd25899989f70b70" exitCode=0 Apr 16 16:26:49.789574 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:49.789142 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" event={"ID":"4f9b721b-551f-40d8-a78f-d07bfd3eae71","Type":"ContainerDied","Data":"3e4c9afbe0ffda777b75920f56acd948fa0931d14d7f5628fd25899989f70b70"} Apr 16 16:26:50.793140 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:50.793104 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" event={"ID":"4f9b721b-551f-40d8-a78f-d07bfd3eae71","Type":"ContainerStarted","Data":"22f7c4281c1b2333fedb9589a3032e96149582cafce9add13d52e3ba2ee332a6"} Apr 16 16:26:50.793540 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:50.793394 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" Apr 16 16:26:50.794670 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:50.794646 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" podUID="4f9b721b-551f-40d8-a78f-d07bfd3eae71" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 16:26:50.811485 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:50.811441 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" podStartSLOduration=7.8114283 podStartE2EDuration="7.8114283s" podCreationTimestamp="2026-04-16 16:26:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:26:50.810698675 +0000 UTC m=+1434.165688805" watchObservedRunningTime="2026-04-16 16:26:50.8114283 +0000 UTC m=+1434.166418429" Apr 16 16:26:51.796720 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:26:51.796687 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" podUID="4f9b721b-551f-40d8-a78f-d07bfd3eae71" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 16:27:01.796916 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:01.796820 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" podUID="4f9b721b-551f-40d8-a78f-d07bfd3eae71" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 16:27:11.797070 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:11.797023 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" podUID="4f9b721b-551f-40d8-a78f-d07bfd3eae71" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 16:27:21.797094 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:21.797051 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" podUID="4f9b721b-551f-40d8-a78f-d07bfd3eae71" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 16:27:31.797414 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:31.797384 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" Apr 16 16:27:34.964341 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:34.964311 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc"] Apr 16 16:27:34.964925 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:34.964564 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" podUID="4f9b721b-551f-40d8-a78f-d07bfd3eae71" containerName="kserve-container" containerID="cri-o://22f7c4281c1b2333fedb9589a3032e96149582cafce9add13d52e3ba2ee332a6" gracePeriod=30 Apr 16 16:27:35.043658 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:35.043628 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62"] Apr 16 16:27:35.043934 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:35.043921 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a642ceb0-7db9-4be3-8d65-25de0b4635b1" containerName="kserve-container" Apr 16 16:27:35.043977 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:35.043936 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a642ceb0-7db9-4be3-8d65-25de0b4635b1" containerName="kserve-container" Apr 16 16:27:35.043977 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:35.043956 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a642ceb0-7db9-4be3-8d65-25de0b4635b1" containerName="storage-initializer" Apr 16 16:27:35.043977 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:35.043962 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a642ceb0-7db9-4be3-8d65-25de0b4635b1" containerName="storage-initializer" Apr 16 16:27:35.044069 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:35.044011 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a642ceb0-7db9-4be3-8d65-25de0b4635b1" containerName="kserve-container" Apr 16 16:27:35.047043 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:35.047019 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" Apr 16 16:27:35.054609 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:35.054589 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62"] Apr 16 16:27:35.148098 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:35.148066 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f07735c-f528-4676-bd44-27236301557c-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-679d448945-b6j62\" (UID: \"9f07735c-f528-4676-bd44-27236301557c\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" Apr 16 16:27:35.249045 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:35.248961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f07735c-f528-4676-bd44-27236301557c-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-679d448945-b6j62\" (UID: \"9f07735c-f528-4676-bd44-27236301557c\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" Apr 16 16:27:35.249348 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:35.249329 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f07735c-f528-4676-bd44-27236301557c-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-679d448945-b6j62\" (UID: \"9f07735c-f528-4676-bd44-27236301557c\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" Apr 16 16:27:35.357527 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:35.357487 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" Apr 16 16:27:35.474074 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:35.474042 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62"] Apr 16 16:27:35.477134 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:27:35.477098 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f07735c_f528_4676_bd44_27236301557c.slice/crio-d830e84ff9a7d49dec1a8204fa9b551c22dba127d1c5ada64531defe72f22ca7 WatchSource:0}: Error finding container d830e84ff9a7d49dec1a8204fa9b551c22dba127d1c5ada64531defe72f22ca7: Status 404 returned error can't find the container with id d830e84ff9a7d49dec1a8204fa9b551c22dba127d1c5ada64531defe72f22ca7 Apr 16 16:27:35.939867 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:35.939827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" event={"ID":"9f07735c-f528-4676-bd44-27236301557c","Type":"ContainerStarted","Data":"76e7ee981186ed6a9a711e747d874336a7c3f58944dc9f3d3fc36688527de52e"} Apr 16 16:27:35.939867 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:35.939872 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" event={"ID":"9f07735c-f528-4676-bd44-27236301557c","Type":"ContainerStarted","Data":"d830e84ff9a7d49dec1a8204fa9b551c22dba127d1c5ada64531defe72f22ca7"} Apr 16 16:27:37.601823 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:37.601796 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" Apr 16 16:27:37.666657 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:37.666633 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f9b721b-551f-40d8-a78f-d07bfd3eae71-kserve-provision-location\") pod \"4f9b721b-551f-40d8-a78f-d07bfd3eae71\" (UID: \"4f9b721b-551f-40d8-a78f-d07bfd3eae71\") " Apr 16 16:27:37.675622 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:37.675596 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f9b721b-551f-40d8-a78f-d07bfd3eae71-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4f9b721b-551f-40d8-a78f-d07bfd3eae71" (UID: "4f9b721b-551f-40d8-a78f-d07bfd3eae71"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:27:37.767157 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:37.767107 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f9b721b-551f-40d8-a78f-d07bfd3eae71-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:27:37.947930 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:37.947899 2576 generic.go:358] "Generic (PLEG): container finished" podID="4f9b721b-551f-40d8-a78f-d07bfd3eae71" containerID="22f7c4281c1b2333fedb9589a3032e96149582cafce9add13d52e3ba2ee332a6" exitCode=0 Apr 16 16:27:37.948068 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:37.947957 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" event={"ID":"4f9b721b-551f-40d8-a78f-d07bfd3eae71","Type":"ContainerDied","Data":"22f7c4281c1b2333fedb9589a3032e96149582cafce9add13d52e3ba2ee332a6"} Apr 16 16:27:37.948068 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:27:37.947963 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" Apr 16 16:27:37.948068 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:37.947991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc" event={"ID":"4f9b721b-551f-40d8-a78f-d07bfd3eae71","Type":"ContainerDied","Data":"931c036a9e055a9ca2e3ef965414851ce5b07166343ee45263835b5f4f0889a1"} Apr 16 16:27:37.948068 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:37.948011 2576 scope.go:117] "RemoveContainer" containerID="22f7c4281c1b2333fedb9589a3032e96149582cafce9add13d52e3ba2ee332a6" Apr 16 16:27:37.956795 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:37.956779 2576 scope.go:117] "RemoveContainer" containerID="3e4c9afbe0ffda777b75920f56acd948fa0931d14d7f5628fd25899989f70b70" Apr 16 16:27:37.963623 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:37.963608 2576 scope.go:117] "RemoveContainer" containerID="22f7c4281c1b2333fedb9589a3032e96149582cafce9add13d52e3ba2ee332a6" Apr 16 16:27:37.963873 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:27:37.963855 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22f7c4281c1b2333fedb9589a3032e96149582cafce9add13d52e3ba2ee332a6\": container with ID starting with 22f7c4281c1b2333fedb9589a3032e96149582cafce9add13d52e3ba2ee332a6 not found: ID does not exist" containerID="22f7c4281c1b2333fedb9589a3032e96149582cafce9add13d52e3ba2ee332a6" Apr 16 16:27:37.963939 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:37.963884 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22f7c4281c1b2333fedb9589a3032e96149582cafce9add13d52e3ba2ee332a6"} err="failed to get container status \"22f7c4281c1b2333fedb9589a3032e96149582cafce9add13d52e3ba2ee332a6\": rpc error: 
code = NotFound desc = could not find container \"22f7c4281c1b2333fedb9589a3032e96149582cafce9add13d52e3ba2ee332a6\": container with ID starting with 22f7c4281c1b2333fedb9589a3032e96149582cafce9add13d52e3ba2ee332a6 not found: ID does not exist" Apr 16 16:27:37.963939 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:37.963907 2576 scope.go:117] "RemoveContainer" containerID="3e4c9afbe0ffda777b75920f56acd948fa0931d14d7f5628fd25899989f70b70" Apr 16 16:27:37.964126 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:27:37.964109 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e4c9afbe0ffda777b75920f56acd948fa0931d14d7f5628fd25899989f70b70\": container with ID starting with 3e4c9afbe0ffda777b75920f56acd948fa0931d14d7f5628fd25899989f70b70 not found: ID does not exist" containerID="3e4c9afbe0ffda777b75920f56acd948fa0931d14d7f5628fd25899989f70b70" Apr 16 16:27:37.964171 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:37.964133 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e4c9afbe0ffda777b75920f56acd948fa0931d14d7f5628fd25899989f70b70"} err="failed to get container status \"3e4c9afbe0ffda777b75920f56acd948fa0931d14d7f5628fd25899989f70b70\": rpc error: code = NotFound desc = could not find container \"3e4c9afbe0ffda777b75920f56acd948fa0931d14d7f5628fd25899989f70b70\": container with ID starting with 3e4c9afbe0ffda777b75920f56acd948fa0931d14d7f5628fd25899989f70b70 not found: ID does not exist" Apr 16 16:27:37.969798 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:37.969776 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc"] Apr 16 16:27:37.973161 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:37.973145 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-8c7997d9b-dzhcc"] Apr 16 16:27:39.277821 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:27:39.277778 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f9b721b-551f-40d8-a78f-d07bfd3eae71" path="/var/lib/kubelet/pods/4f9b721b-551f-40d8-a78f-d07bfd3eae71/volumes" Apr 16 16:27:40.960163 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:40.960129 2576 generic.go:358] "Generic (PLEG): container finished" podID="9f07735c-f528-4676-bd44-27236301557c" containerID="76e7ee981186ed6a9a711e747d874336a7c3f58944dc9f3d3fc36688527de52e" exitCode=0 Apr 16 16:27:40.960646 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:40.960191 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" event={"ID":"9f07735c-f528-4676-bd44-27236301557c","Type":"ContainerDied","Data":"76e7ee981186ed6a9a711e747d874336a7c3f58944dc9f3d3fc36688527de52e"} Apr 16 16:27:41.964734 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:41.964702 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" event={"ID":"9f07735c-f528-4676-bd44-27236301557c","Type":"ContainerStarted","Data":"4077d485ce5b150654074d18715af45ea82c6165c805aa164bd96e27d98cbbf0"} Apr 16 16:27:41.965132 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:41.964985 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" Apr 16 16:27:41.966382 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:41.966352 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" podUID="9f07735c-f528-4676-bd44-27236301557c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 16:27:41.982559 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:41.982521 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" podStartSLOduration=6.982509406 podStartE2EDuration="6.982509406s" podCreationTimestamp="2026-04-16 16:27:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:27:41.980823942 +0000 UTC m=+1485.335814068" watchObservedRunningTime="2026-04-16 16:27:41.982509406 +0000 UTC m=+1485.337499535" Apr 16 16:27:42.967671 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:42.967628 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" podUID="9f07735c-f528-4676-bd44-27236301557c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 16:27:52.968402 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:27:52.968356 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" podUID="9f07735c-f528-4676-bd44-27236301557c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 16:28:02.968513 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:02.968464 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" podUID="9f07735c-f528-4676-bd44-27236301557c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 16:28:12.968383 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:12.968341 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" podUID="9f07735c-f528-4676-bd44-27236301557c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 16:28:22.969445 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:22.969365 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" Apr 16 16:28:26.783073 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:26.783040 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62"] Apr 16 16:28:26.783462 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:26.783313 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" podUID="9f07735c-f528-4676-bd44-27236301557c" containerName="kserve-container" containerID="cri-o://4077d485ce5b150654074d18715af45ea82c6165c805aa164bd96e27d98cbbf0" gracePeriod=30 Apr 16 16:28:26.857174 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:26.857133 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9"] Apr 16 16:28:26.857562 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:26.857544 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f9b721b-551f-40d8-a78f-d07bfd3eae71" containerName="kserve-container" Apr 16 16:28:26.857655 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:26.857566 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9b721b-551f-40d8-a78f-d07bfd3eae71" containerName="kserve-container" Apr 16 16:28:26.857655 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:26.857602 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f9b721b-551f-40d8-a78f-d07bfd3eae71" containerName="storage-initializer" Apr 16 16:28:26.857655 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:26.857611 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f9b721b-551f-40d8-a78f-d07bfd3eae71" containerName="storage-initializer" Apr 16 16:28:26.857813 ip-10-0-137-150 kubenswrapper[2576]: 
I0416 16:28:26.857692 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f9b721b-551f-40d8-a78f-d07bfd3eae71" containerName="kserve-container" Apr 16 16:28:26.860632 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:26.860606 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" Apr 16 16:28:26.868095 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:26.867641 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9"] Apr 16 16:28:26.961739 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:26.961699 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/938ad486-5276-4135-aa5b-de465a2b49e9-kserve-provision-location\") pod \"isvc-pmml-predictor-89795c578-km4b9\" (UID: \"938ad486-5276-4135-aa5b-de465a2b49e9\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" Apr 16 16:28:27.063075 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:27.062969 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/938ad486-5276-4135-aa5b-de465a2b49e9-kserve-provision-location\") pod \"isvc-pmml-predictor-89795c578-km4b9\" (UID: \"938ad486-5276-4135-aa5b-de465a2b49e9\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" Apr 16 16:28:27.063389 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:27.063368 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/938ad486-5276-4135-aa5b-de465a2b49e9-kserve-provision-location\") pod \"isvc-pmml-predictor-89795c578-km4b9\" (UID: \"938ad486-5276-4135-aa5b-de465a2b49e9\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" Apr 16 16:28:27.172310 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:28:27.172277 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" Apr 16 16:28:27.295015 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:27.294891 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9"] Apr 16 16:28:27.298018 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:28:27.297991 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod938ad486_5276_4135_aa5b_de465a2b49e9.slice/crio-7c114b964d4eedee627e360d709a6e5218fa66ef0b6584adf62e6bcdc7f65a45 WatchSource:0}: Error finding container 7c114b964d4eedee627e360d709a6e5218fa66ef0b6584adf62e6bcdc7f65a45: Status 404 returned error can't find the container with id 7c114b964d4eedee627e360d709a6e5218fa66ef0b6584adf62e6bcdc7f65a45 Apr 16 16:28:28.110646 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:28.110610 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" event={"ID":"938ad486-5276-4135-aa5b-de465a2b49e9","Type":"ContainerStarted","Data":"6ebfd42e4c3828803e9492f3766bbfdfc695758bf44d2360e33eed627273d435"} Apr 16 16:28:28.110646 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:28.110646 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" event={"ID":"938ad486-5276-4135-aa5b-de465a2b49e9","Type":"ContainerStarted","Data":"7c114b964d4eedee627e360d709a6e5218fa66ef0b6584adf62e6bcdc7f65a45"} Apr 16 16:28:29.516493 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:29.516468 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" Apr 16 16:28:29.584631 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:29.584600 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f07735c-f528-4676-bd44-27236301557c-kserve-provision-location\") pod \"9f07735c-f528-4676-bd44-27236301557c\" (UID: \"9f07735c-f528-4676-bd44-27236301557c\") " Apr 16 16:28:29.594308 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:29.594242 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f07735c-f528-4676-bd44-27236301557c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9f07735c-f528-4676-bd44-27236301557c" (UID: "9f07735c-f528-4676-bd44-27236301557c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:28:29.685578 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:29.685545 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f07735c-f528-4676-bd44-27236301557c-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:28:30.118281 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:30.118237 2576 generic.go:358] "Generic (PLEG): container finished" podID="9f07735c-f528-4676-bd44-27236301557c" containerID="4077d485ce5b150654074d18715af45ea82c6165c805aa164bd96e27d98cbbf0" exitCode=0 Apr 16 16:28:30.118497 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:30.118324 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" Apr 16 16:28:30.118497 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:30.118334 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" event={"ID":"9f07735c-f528-4676-bd44-27236301557c","Type":"ContainerDied","Data":"4077d485ce5b150654074d18715af45ea82c6165c805aa164bd96e27d98cbbf0"} Apr 16 16:28:30.118497 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:30.118381 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62" event={"ID":"9f07735c-f528-4676-bd44-27236301557c","Type":"ContainerDied","Data":"d830e84ff9a7d49dec1a8204fa9b551c22dba127d1c5ada64531defe72f22ca7"} Apr 16 16:28:30.118497 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:30.118401 2576 scope.go:117] "RemoveContainer" containerID="4077d485ce5b150654074d18715af45ea82c6165c805aa164bd96e27d98cbbf0" Apr 16 16:28:30.126961 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:30.126942 2576 scope.go:117] "RemoveContainer" containerID="76e7ee981186ed6a9a711e747d874336a7c3f58944dc9f3d3fc36688527de52e" Apr 16 16:28:30.134343 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:30.134324 2576 scope.go:117] "RemoveContainer" containerID="4077d485ce5b150654074d18715af45ea82c6165c805aa164bd96e27d98cbbf0" Apr 16 16:28:30.134588 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:28:30.134568 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4077d485ce5b150654074d18715af45ea82c6165c805aa164bd96e27d98cbbf0\": container with ID starting with 4077d485ce5b150654074d18715af45ea82c6165c805aa164bd96e27d98cbbf0 not found: ID does not exist" containerID="4077d485ce5b150654074d18715af45ea82c6165c805aa164bd96e27d98cbbf0" Apr 16 16:28:30.134661 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:30.134603 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4077d485ce5b150654074d18715af45ea82c6165c805aa164bd96e27d98cbbf0"} err="failed to get container status \"4077d485ce5b150654074d18715af45ea82c6165c805aa164bd96e27d98cbbf0\": rpc error: code = NotFound desc = could not find container \"4077d485ce5b150654074d18715af45ea82c6165c805aa164bd96e27d98cbbf0\": container with ID starting with 4077d485ce5b150654074d18715af45ea82c6165c805aa164bd96e27d98cbbf0 not found: ID does not exist" Apr 16 16:28:30.134661 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:30.134629 2576 scope.go:117] "RemoveContainer" containerID="76e7ee981186ed6a9a711e747d874336a7c3f58944dc9f3d3fc36688527de52e" Apr 16 16:28:30.134865 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:28:30.134849 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76e7ee981186ed6a9a711e747d874336a7c3f58944dc9f3d3fc36688527de52e\": container with ID starting with 76e7ee981186ed6a9a711e747d874336a7c3f58944dc9f3d3fc36688527de52e not found: ID does not exist" containerID="76e7ee981186ed6a9a711e747d874336a7c3f58944dc9f3d3fc36688527de52e" Apr 16 16:28:30.134907 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:30.134870 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76e7ee981186ed6a9a711e747d874336a7c3f58944dc9f3d3fc36688527de52e"} err="failed to get container status \"76e7ee981186ed6a9a711e747d874336a7c3f58944dc9f3d3fc36688527de52e\": rpc error: code = NotFound desc = could not find container \"76e7ee981186ed6a9a711e747d874336a7c3f58944dc9f3d3fc36688527de52e\": container with ID starting with 76e7ee981186ed6a9a711e747d874336a7c3f58944dc9f3d3fc36688527de52e not found: ID does not exist" Apr 16 16:28:30.139546 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:30.139524 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62"] Apr 16 16:28:30.142487 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:30.142466 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-679d448945-b6j62"] Apr 16 16:28:31.123024 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:31.122998 2576 generic.go:358] "Generic (PLEG): container finished" podID="938ad486-5276-4135-aa5b-de465a2b49e9" containerID="6ebfd42e4c3828803e9492f3766bbfdfc695758bf44d2360e33eed627273d435" exitCode=0 Apr 16 16:28:31.123340 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:31.123059 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" event={"ID":"938ad486-5276-4135-aa5b-de465a2b49e9","Type":"ContainerDied","Data":"6ebfd42e4c3828803e9492f3766bbfdfc695758bf44d2360e33eed627273d435"} Apr 16 16:28:31.279130 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:31.279086 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f07735c-f528-4676-bd44-27236301557c" path="/var/lib/kubelet/pods/9f07735c-f528-4676-bd44-27236301557c/volumes" Apr 16 16:28:39.158449 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:39.158416 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" event={"ID":"938ad486-5276-4135-aa5b-de465a2b49e9","Type":"ContainerStarted","Data":"0ab29f3a534ff72c3ef3b03d11732adbe77cbde344c65d7e0308a2e5c52e358b"} Apr 16 16:28:39.158829 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:39.158708 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" Apr 16 16:28:39.159943 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:39.159913 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 16:28:39.176578 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:39.176540 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" podStartSLOduration=6.18509099 podStartE2EDuration="13.176528412s" podCreationTimestamp="2026-04-16 16:28:26 +0000 UTC" firstStartedPulling="2026-04-16 16:28:31.124231042 +0000 UTC m=+1534.479221150" lastFinishedPulling="2026-04-16 16:28:38.115668455 +0000 UTC m=+1541.470658572" observedRunningTime="2026-04-16 16:28:39.174612234 +0000 UTC m=+1542.529602363" watchObservedRunningTime="2026-04-16 16:28:39.176528412 +0000 UTC m=+1542.531518541" Apr 16 16:28:40.162126 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:40.162093 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 16:28:50.162298 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:28:50.162256 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 16:29:00.162646 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:29:00.162602 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 16:29:10.163062 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:29:10.163020 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 16:29:20.162263 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:29:20.162196 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 16:29:30.162508 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:29:30.162469 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 16:29:40.162963 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:29:40.162922 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 16:29:50.162557 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:29:50.162473 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 16:30:00.163041 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:00.162998 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" Apr 16 16:30:07.885478 ip-10-0-137-150 kubenswrapper[2576]: I0416 
16:30:07.885448 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9"] Apr 16 16:30:07.885892 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:07.885730 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" containerName="kserve-container" containerID="cri-o://0ab29f3a534ff72c3ef3b03d11732adbe77cbde344c65d7e0308a2e5c52e358b" gracePeriod=30 Apr 16 16:30:07.997157 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:07.997122 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946"] Apr 16 16:30:07.997449 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:07.997436 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f07735c-f528-4676-bd44-27236301557c" containerName="kserve-container" Apr 16 16:30:07.997509 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:07.997450 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f07735c-f528-4676-bd44-27236301557c" containerName="kserve-container" Apr 16 16:30:07.997509 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:07.997464 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f07735c-f528-4676-bd44-27236301557c" containerName="storage-initializer" Apr 16 16:30:07.997509 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:07.997470 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f07735c-f528-4676-bd44-27236301557c" containerName="storage-initializer" Apr 16 16:30:07.997609 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:07.997532 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f07735c-f528-4676-bd44-27236301557c" containerName="kserve-container" Apr 16 16:30:07.999563 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:07.999544 2576 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" Apr 16 16:30:08.009125 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:08.009093 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946"] Apr 16 16:30:08.157341 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:08.157315 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-5bf494d8dc-xg946\" (UID: \"7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" Apr 16 16:30:08.258035 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:08.258009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-5bf494d8dc-xg946\" (UID: \"7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" Apr 16 16:30:08.258359 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:08.258344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-5bf494d8dc-xg946\" (UID: \"7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" Apr 16 16:30:08.309619 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:08.309598 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" Apr 16 16:30:08.438247 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:08.438155 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946"] Apr 16 16:30:08.441280 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:30:08.441255 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7849911b_7aa1_4bdf_bcb3_1bf8cd0d5e8e.slice/crio-dfb17c784905d539c9423fa2a533ab1e3e1a399435dca7d05ff118832e9b37b3 WatchSource:0}: Error finding container dfb17c784905d539c9423fa2a533ab1e3e1a399435dca7d05ff118832e9b37b3: Status 404 returned error can't find the container with id dfb17c784905d539c9423fa2a533ab1e3e1a399435dca7d05ff118832e9b37b3 Apr 16 16:30:09.435307 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:09.435274 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" event={"ID":"7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e","Type":"ContainerStarted","Data":"3693a6301774b8b2a5cc37365327a0709c2a121d36f04ae4a8af641369504df4"} Apr 16 16:30:09.435307 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:09.435311 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" event={"ID":"7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e","Type":"ContainerStarted","Data":"dfb17c784905d539c9423fa2a533ab1e3e1a399435dca7d05ff118832e9b37b3"} Apr 16 16:30:10.163123 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:10.163080 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.42:8080: connect: connection refused" Apr 16 16:30:11.224435 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:30:11.224405 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" Apr 16 16:30:11.386777 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:11.386709 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/938ad486-5276-4135-aa5b-de465a2b49e9-kserve-provision-location\") pod \"938ad486-5276-4135-aa5b-de465a2b49e9\" (UID: \"938ad486-5276-4135-aa5b-de465a2b49e9\") " Apr 16 16:30:11.387012 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:11.386989 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/938ad486-5276-4135-aa5b-de465a2b49e9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "938ad486-5276-4135-aa5b-de465a2b49e9" (UID: "938ad486-5276-4135-aa5b-de465a2b49e9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:30:11.442444 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:11.442419 2576 generic.go:358] "Generic (PLEG): container finished" podID="938ad486-5276-4135-aa5b-de465a2b49e9" containerID="0ab29f3a534ff72c3ef3b03d11732adbe77cbde344c65d7e0308a2e5c52e358b" exitCode=0 Apr 16 16:30:11.442559 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:11.442449 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" event={"ID":"938ad486-5276-4135-aa5b-de465a2b49e9","Type":"ContainerDied","Data":"0ab29f3a534ff72c3ef3b03d11732adbe77cbde344c65d7e0308a2e5c52e358b"} Apr 16 16:30:11.442559 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:11.442469 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" event={"ID":"938ad486-5276-4135-aa5b-de465a2b49e9","Type":"ContainerDied","Data":"7c114b964d4eedee627e360d709a6e5218fa66ef0b6584adf62e6bcdc7f65a45"} Apr 16 16:30:11.442559 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:11.442483 2576 scope.go:117] "RemoveContainer" containerID="0ab29f3a534ff72c3ef3b03d11732adbe77cbde344c65d7e0308a2e5c52e358b" Apr 16 16:30:11.442559 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:11.442482 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9" Apr 16 16:30:11.450124 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:11.450107 2576 scope.go:117] "RemoveContainer" containerID="6ebfd42e4c3828803e9492f3766bbfdfc695758bf44d2360e33eed627273d435" Apr 16 16:30:11.458552 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:11.458538 2576 scope.go:117] "RemoveContainer" containerID="0ab29f3a534ff72c3ef3b03d11732adbe77cbde344c65d7e0308a2e5c52e358b" Apr 16 16:30:11.458792 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:30:11.458775 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ab29f3a534ff72c3ef3b03d11732adbe77cbde344c65d7e0308a2e5c52e358b\": container with ID starting with 0ab29f3a534ff72c3ef3b03d11732adbe77cbde344c65d7e0308a2e5c52e358b not found: ID does not exist" containerID="0ab29f3a534ff72c3ef3b03d11732adbe77cbde344c65d7e0308a2e5c52e358b" Apr 16 16:30:11.458839 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:11.458799 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab29f3a534ff72c3ef3b03d11732adbe77cbde344c65d7e0308a2e5c52e358b"} err="failed to get container status \"0ab29f3a534ff72c3ef3b03d11732adbe77cbde344c65d7e0308a2e5c52e358b\": rpc error: code = NotFound desc = could not find container \"0ab29f3a534ff72c3ef3b03d11732adbe77cbde344c65d7e0308a2e5c52e358b\": container with ID starting with 0ab29f3a534ff72c3ef3b03d11732adbe77cbde344c65d7e0308a2e5c52e358b not found: ID does not exist" Apr 16 16:30:11.458839 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:11.458814 2576 scope.go:117] "RemoveContainer" containerID="6ebfd42e4c3828803e9492f3766bbfdfc695758bf44d2360e33eed627273d435" Apr 16 16:30:11.459045 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:30:11.459027 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6ebfd42e4c3828803e9492f3766bbfdfc695758bf44d2360e33eed627273d435\": container with ID starting with 6ebfd42e4c3828803e9492f3766bbfdfc695758bf44d2360e33eed627273d435 not found: ID does not exist" containerID="6ebfd42e4c3828803e9492f3766bbfdfc695758bf44d2360e33eed627273d435" Apr 16 16:30:11.459080 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:11.459052 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ebfd42e4c3828803e9492f3766bbfdfc695758bf44d2360e33eed627273d435"} err="failed to get container status \"6ebfd42e4c3828803e9492f3766bbfdfc695758bf44d2360e33eed627273d435\": rpc error: code = NotFound desc = could not find container \"6ebfd42e4c3828803e9492f3766bbfdfc695758bf44d2360e33eed627273d435\": container with ID starting with 6ebfd42e4c3828803e9492f3766bbfdfc695758bf44d2360e33eed627273d435 not found: ID does not exist" Apr 16 16:30:11.464036 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:11.464015 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9"] Apr 16 16:30:11.471656 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:11.467265 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-89795c578-km4b9"] Apr 16 16:30:11.487710 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:11.487691 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/938ad486-5276-4135-aa5b-de465a2b49e9-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:30:12.447757 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:12.447726 2576 generic.go:358] "Generic (PLEG): container finished" podID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerID="3693a6301774b8b2a5cc37365327a0709c2a121d36f04ae4a8af641369504df4" exitCode=0 Apr 16 16:30:12.448149 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:12.447769 2576 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" event={"ID":"7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e","Type":"ContainerDied","Data":"3693a6301774b8b2a5cc37365327a0709c2a121d36f04ae4a8af641369504df4"} Apr 16 16:30:13.277768 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:13.277734 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" path="/var/lib/kubelet/pods/938ad486-5276-4135-aa5b-de465a2b49e9/volumes" Apr 16 16:30:13.453375 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:13.453339 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" event={"ID":"7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e","Type":"ContainerStarted","Data":"fbe110cd1ec055e866f369dda2ce4c5a31fb83679aeafa385d8d96d48e136019"} Apr 16 16:30:13.453790 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:13.453660 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" Apr 16 16:30:13.454497 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:13.454465 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 16:30:13.470701 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:13.470662 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" podStartSLOduration=6.470650403 podStartE2EDuration="6.470650403s" podCreationTimestamp="2026-04-16 16:30:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:30:13.468284825 +0000 UTC 
m=+1636.823274955" watchObservedRunningTime="2026-04-16 16:30:13.470650403 +0000 UTC m=+1636.825640528" Apr 16 16:30:14.457112 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:14.457060 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 16:30:24.458014 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:24.457974 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 16:30:34.457606 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:34.457558 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 16:30:44.457291 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:44.457249 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 16:30:54.457750 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:30:54.457707 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 16:31:04.457133 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:04.457086 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 16:31:14.457663 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:14.457616 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 16:31:24.457376 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:24.457293 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 16:31:27.275769 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:27.275734 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.43:8080: connect: connection refused" Apr 16 16:31:37.278510 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:37.278482 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" Apr 16 16:31:38.997811 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:38.997771 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946"] Apr 16 16:31:38.998341 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:38.998107 2576 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="kserve-container" containerID="cri-o://fbe110cd1ec055e866f369dda2ce4c5a31fb83679aeafa385d8d96d48e136019" gracePeriod=30 Apr 16 16:31:39.080972 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.080938 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq"] Apr 16 16:31:39.081296 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.081283 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" containerName="kserve-container" Apr 16 16:31:39.081346 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.081297 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" containerName="kserve-container" Apr 16 16:31:39.081346 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.081318 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" containerName="storage-initializer" Apr 16 16:31:39.081346 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.081323 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" containerName="storage-initializer" Apr 16 16:31:39.081449 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.081384 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="938ad486-5276-4135-aa5b-de465a2b49e9" containerName="kserve-container" Apr 16 16:31:39.083563 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.083545 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" Apr 16 16:31:39.094522 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.094500 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq"] Apr 16 16:31:39.260735 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.260657 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eda2ad7a-c8a0-406c-b0a4-37e69a12aa13-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq\" (UID: \"eda2ad7a-c8a0-406c-b0a4-37e69a12aa13\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" Apr 16 16:31:39.361036 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.361004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eda2ad7a-c8a0-406c-b0a4-37e69a12aa13-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq\" (UID: \"eda2ad7a-c8a0-406c-b0a4-37e69a12aa13\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" Apr 16 16:31:39.361391 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.361372 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eda2ad7a-c8a0-406c-b0a4-37e69a12aa13-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq\" (UID: \"eda2ad7a-c8a0-406c-b0a4-37e69a12aa13\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" Apr 16 16:31:39.393574 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.393542 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" Apr 16 16:31:39.510071 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.510047 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq"] Apr 16 16:31:39.512746 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:31:39.512681 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeda2ad7a_c8a0_406c_b0a4_37e69a12aa13.slice/crio-ea231c4727a17b03bfe9048e2f201e5b937e9f2fbc2af3783c13aca0aaf5f89e WatchSource:0}: Error finding container ea231c4727a17b03bfe9048e2f201e5b937e9f2fbc2af3783c13aca0aaf5f89e: Status 404 returned error can't find the container with id ea231c4727a17b03bfe9048e2f201e5b937e9f2fbc2af3783c13aca0aaf5f89e Apr 16 16:31:39.514592 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.514575 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:31:39.721062 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.721029 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" event={"ID":"eda2ad7a-c8a0-406c-b0a4-37e69a12aa13","Type":"ContainerStarted","Data":"5a80dd3a2edb136c2420e80d351f10a321595649ce46579d4e9425d96e3cf99d"} Apr 16 16:31:39.721062 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:39.721064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" event={"ID":"eda2ad7a-c8a0-406c-b0a4-37e69a12aa13","Type":"ContainerStarted","Data":"ea231c4727a17b03bfe9048e2f201e5b937e9f2fbc2af3783c13aca0aaf5f89e"} Apr 16 16:31:42.229877 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.229855 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" Apr 16 16:31:42.384185 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.384111 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e-kserve-provision-location\") pod \"7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e\" (UID: \"7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e\") " Apr 16 16:31:42.384462 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.384439 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" (UID: "7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:31:42.485495 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.485469 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:31:42.732996 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.732967 2576 generic.go:358] "Generic (PLEG): container finished" podID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerID="fbe110cd1ec055e866f369dda2ce4c5a31fb83679aeafa385d8d96d48e136019" exitCode=0 Apr 16 16:31:42.733120 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.733031 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" Apr 16 16:31:42.733120 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.733051 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" event={"ID":"7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e","Type":"ContainerDied","Data":"fbe110cd1ec055e866f369dda2ce4c5a31fb83679aeafa385d8d96d48e136019"} Apr 16 16:31:42.733120 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.733090 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946" event={"ID":"7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e","Type":"ContainerDied","Data":"dfb17c784905d539c9423fa2a533ab1e3e1a399435dca7d05ff118832e9b37b3"} Apr 16 16:31:42.733120 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.733105 2576 scope.go:117] "RemoveContainer" containerID="fbe110cd1ec055e866f369dda2ce4c5a31fb83679aeafa385d8d96d48e136019" Apr 16 16:31:42.741687 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.741673 2576 scope.go:117] "RemoveContainer" containerID="3693a6301774b8b2a5cc37365327a0709c2a121d36f04ae4a8af641369504df4" Apr 16 16:31:42.748083 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.748066 2576 scope.go:117] "RemoveContainer" containerID="fbe110cd1ec055e866f369dda2ce4c5a31fb83679aeafa385d8d96d48e136019" Apr 16 16:31:42.748335 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:31:42.748317 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbe110cd1ec055e866f369dda2ce4c5a31fb83679aeafa385d8d96d48e136019\": container with ID starting with fbe110cd1ec055e866f369dda2ce4c5a31fb83679aeafa385d8d96d48e136019 not found: ID does not exist" containerID="fbe110cd1ec055e866f369dda2ce4c5a31fb83679aeafa385d8d96d48e136019" Apr 16 16:31:42.748398 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.748344 2576 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbe110cd1ec055e866f369dda2ce4c5a31fb83679aeafa385d8d96d48e136019"} err="failed to get container status \"fbe110cd1ec055e866f369dda2ce4c5a31fb83679aeafa385d8d96d48e136019\": rpc error: code = NotFound desc = could not find container \"fbe110cd1ec055e866f369dda2ce4c5a31fb83679aeafa385d8d96d48e136019\": container with ID starting with fbe110cd1ec055e866f369dda2ce4c5a31fb83679aeafa385d8d96d48e136019 not found: ID does not exist" Apr 16 16:31:42.748398 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.748362 2576 scope.go:117] "RemoveContainer" containerID="3693a6301774b8b2a5cc37365327a0709c2a121d36f04ae4a8af641369504df4" Apr 16 16:31:42.748592 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:31:42.748571 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3693a6301774b8b2a5cc37365327a0709c2a121d36f04ae4a8af641369504df4\": container with ID starting with 3693a6301774b8b2a5cc37365327a0709c2a121d36f04ae4a8af641369504df4 not found: ID does not exist" containerID="3693a6301774b8b2a5cc37365327a0709c2a121d36f04ae4a8af641369504df4" Apr 16 16:31:42.748650 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.748601 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3693a6301774b8b2a5cc37365327a0709c2a121d36f04ae4a8af641369504df4"} err="failed to get container status \"3693a6301774b8b2a5cc37365327a0709c2a121d36f04ae4a8af641369504df4\": rpc error: code = NotFound desc = could not find container \"3693a6301774b8b2a5cc37365327a0709c2a121d36f04ae4a8af641369504df4\": container with ID starting with 3693a6301774b8b2a5cc37365327a0709c2a121d36f04ae4a8af641369504df4 not found: ID does not exist" Apr 16 16:31:42.754392 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.754371 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946"] Apr 16 16:31:42.758851 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:42.758831 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-5bf494d8dc-xg946"] Apr 16 16:31:43.278903 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:43.278875 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" path="/var/lib/kubelet/pods/7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e/volumes" Apr 16 16:31:43.740189 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:43.740161 2576 generic.go:358] "Generic (PLEG): container finished" podID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerID="5a80dd3a2edb136c2420e80d351f10a321595649ce46579d4e9425d96e3cf99d" exitCode=0 Apr 16 16:31:43.740362 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:43.740233 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" event={"ID":"eda2ad7a-c8a0-406c-b0a4-37e69a12aa13","Type":"ContainerDied","Data":"5a80dd3a2edb136c2420e80d351f10a321595649ce46579d4e9425d96e3cf99d"} Apr 16 16:31:44.744804 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:44.744766 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" event={"ID":"eda2ad7a-c8a0-406c-b0a4-37e69a12aa13","Type":"ContainerStarted","Data":"bae4d11b14a9ef9aef9aab1a4294aa22532ef5cb85bbe617f231debec5ad54b8"} Apr 16 16:31:44.745202 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:44.745129 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" Apr 16 16:31:44.746402 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:44.746375 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" 
podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 16:31:44.762548 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:44.762508 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" podStartSLOduration=5.762495627 podStartE2EDuration="5.762495627s" podCreationTimestamp="2026-04-16 16:31:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:31:44.760401268 +0000 UTC m=+1728.115391397" watchObservedRunningTime="2026-04-16 16:31:44.762495627 +0000 UTC m=+1728.117485756" Apr 16 16:31:45.748630 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:45.748588 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 16:31:55.749541 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:31:55.749505 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 16:32:05.748748 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:32:05.748707 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 16:32:15.748999 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:32:15.748964 2576 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 16:32:25.749088 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:32:25.749042 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 16:32:35.749487 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:32:35.749444 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 16:32:45.749161 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:32:45.749067 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 16:32:46.274717 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:32:46.274676 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 16:32:56.275331 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:32:56.275292 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 16:33:06.276406 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:06.276364 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" Apr 16 16:33:10.276234 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:10.276186 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq"] Apr 16 16:33:10.276753 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:10.276444 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="kserve-container" containerID="cri-o://bae4d11b14a9ef9aef9aab1a4294aa22532ef5cb85bbe617f231debec5ad54b8" gracePeriod=30 Apr 16 16:33:10.375987 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:10.375954 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh"] Apr 16 16:33:10.376306 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:10.376291 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="storage-initializer" Apr 16 16:33:10.376306 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:10.376308 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="storage-initializer" Apr 16 16:33:10.376407 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:10.376316 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="kserve-container" Apr 16 
16:33:10.376407 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:10.376322 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="kserve-container" Apr 16 16:33:10.376407 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:10.376399 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7849911b-7aa1-4bdf-bcb3-1bf8cd0d5e8e" containerName="kserve-container" Apr 16 16:33:10.379366 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:10.379346 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" Apr 16 16:33:10.386981 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:10.386954 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh"] Apr 16 16:33:10.436309 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:10.436273 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/64248505-177e-4509-a785-8197d9751694-kserve-provision-location\") pod \"isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh\" (UID: \"64248505-177e-4509-a785-8197d9751694\") " pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" Apr 16 16:33:10.537030 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:10.536947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/64248505-177e-4509-a785-8197d9751694-kserve-provision-location\") pod \"isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh\" (UID: \"64248505-177e-4509-a785-8197d9751694\") " pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" Apr 16 16:33:10.537393 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:10.537372 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/64248505-177e-4509-a785-8197d9751694-kserve-provision-location\") pod \"isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh\" (UID: \"64248505-177e-4509-a785-8197d9751694\") " pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" Apr 16 16:33:10.690089 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:10.690054 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" Apr 16 16:33:10.810539 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:10.810512 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh"] Apr 16 16:33:10.813129 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:33:10.813103 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64248505_177e_4509_a785_8197d9751694.slice/crio-0f215b7c73ecca935f2907f9f9f3e76aef11ece558f65b313d9927861566b847 WatchSource:0}: Error finding container 0f215b7c73ecca935f2907f9f9f3e76aef11ece558f65b313d9927861566b847: Status 404 returned error can't find the container with id 0f215b7c73ecca935f2907f9f9f3e76aef11ece558f65b313d9927861566b847 Apr 16 16:33:11.013524 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:11.013490 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" event={"ID":"64248505-177e-4509-a785-8197d9751694","Type":"ContainerStarted","Data":"d50d309edb14e8fbd695df0bc0f190f87ddffa0f73e995de5e4c81a50df36b98"} Apr 16 16:33:11.013524 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:11.013528 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" 
event={"ID":"64248505-177e-4509-a785-8197d9751694","Type":"ContainerStarted","Data":"0f215b7c73ecca935f2907f9f9f3e76aef11ece558f65b313d9927861566b847"} Apr 16 16:33:13.820171 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:13.820147 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" Apr 16 16:33:13.862914 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:13.862888 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eda2ad7a-c8a0-406c-b0a4-37e69a12aa13-kserve-provision-location\") pod \"eda2ad7a-c8a0-406c-b0a4-37e69a12aa13\" (UID: \"eda2ad7a-c8a0-406c-b0a4-37e69a12aa13\") " Apr 16 16:33:13.863257 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:13.863237 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eda2ad7a-c8a0-406c-b0a4-37e69a12aa13-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" (UID: "eda2ad7a-c8a0-406c-b0a4-37e69a12aa13"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:33:13.964260 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:13.964203 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eda2ad7a-c8a0-406c-b0a4-37e69a12aa13-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:33:14.024425 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:14.024394 2576 generic.go:358] "Generic (PLEG): container finished" podID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerID="bae4d11b14a9ef9aef9aab1a4294aa22532ef5cb85bbe617f231debec5ad54b8" exitCode=0 Apr 16 16:33:14.024541 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:14.024448 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" event={"ID":"eda2ad7a-c8a0-406c-b0a4-37e69a12aa13","Type":"ContainerDied","Data":"bae4d11b14a9ef9aef9aab1a4294aa22532ef5cb85bbe617f231debec5ad54b8"} Apr 16 16:33:14.024541 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:14.024463 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" Apr 16 16:33:14.024541 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:14.024487 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq" event={"ID":"eda2ad7a-c8a0-406c-b0a4-37e69a12aa13","Type":"ContainerDied","Data":"ea231c4727a17b03bfe9048e2f201e5b937e9f2fbc2af3783c13aca0aaf5f89e"} Apr 16 16:33:14.024541 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:14.024504 2576 scope.go:117] "RemoveContainer" containerID="bae4d11b14a9ef9aef9aab1a4294aa22532ef5cb85bbe617f231debec5ad54b8" Apr 16 16:33:14.032866 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:14.032844 2576 scope.go:117] "RemoveContainer" containerID="5a80dd3a2edb136c2420e80d351f10a321595649ce46579d4e9425d96e3cf99d" Apr 16 16:33:14.039821 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:14.039806 2576 scope.go:117] "RemoveContainer" containerID="bae4d11b14a9ef9aef9aab1a4294aa22532ef5cb85bbe617f231debec5ad54b8" Apr 16 16:33:14.040027 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:33:14.040011 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bae4d11b14a9ef9aef9aab1a4294aa22532ef5cb85bbe617f231debec5ad54b8\": container with ID starting with bae4d11b14a9ef9aef9aab1a4294aa22532ef5cb85bbe617f231debec5ad54b8 not found: ID does not exist" containerID="bae4d11b14a9ef9aef9aab1a4294aa22532ef5cb85bbe617f231debec5ad54b8" Apr 16 16:33:14.040072 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:14.040034 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bae4d11b14a9ef9aef9aab1a4294aa22532ef5cb85bbe617f231debec5ad54b8"} err="failed to get container status \"bae4d11b14a9ef9aef9aab1a4294aa22532ef5cb85bbe617f231debec5ad54b8\": rpc error: code = NotFound desc = could not find container 
\"bae4d11b14a9ef9aef9aab1a4294aa22532ef5cb85bbe617f231debec5ad54b8\": container with ID starting with bae4d11b14a9ef9aef9aab1a4294aa22532ef5cb85bbe617f231debec5ad54b8 not found: ID does not exist" Apr 16 16:33:14.040072 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:14.040050 2576 scope.go:117] "RemoveContainer" containerID="5a80dd3a2edb136c2420e80d351f10a321595649ce46579d4e9425d96e3cf99d" Apr 16 16:33:14.040258 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:33:14.040240 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a80dd3a2edb136c2420e80d351f10a321595649ce46579d4e9425d96e3cf99d\": container with ID starting with 5a80dd3a2edb136c2420e80d351f10a321595649ce46579d4e9425d96e3cf99d not found: ID does not exist" containerID="5a80dd3a2edb136c2420e80d351f10a321595649ce46579d4e9425d96e3cf99d" Apr 16 16:33:14.040305 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:14.040267 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a80dd3a2edb136c2420e80d351f10a321595649ce46579d4e9425d96e3cf99d"} err="failed to get container status \"5a80dd3a2edb136c2420e80d351f10a321595649ce46579d4e9425d96e3cf99d\": rpc error: code = NotFound desc = could not find container \"5a80dd3a2edb136c2420e80d351f10a321595649ce46579d4e9425d96e3cf99d\": container with ID starting with 5a80dd3a2edb136c2420e80d351f10a321595649ce46579d4e9425d96e3cf99d not found: ID does not exist" Apr 16 16:33:14.045714 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:14.045695 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq"] Apr 16 16:33:14.049382 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:14.049362 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-5b65fdc9dd-bzzjq"] Apr 16 16:33:15.029856 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:15.029823 2576 
generic.go:358] "Generic (PLEG): container finished" podID="64248505-177e-4509-a785-8197d9751694" containerID="d50d309edb14e8fbd695df0bc0f190f87ddffa0f73e995de5e4c81a50df36b98" exitCode=0 Apr 16 16:33:15.030272 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:15.029866 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" event={"ID":"64248505-177e-4509-a785-8197d9751694","Type":"ContainerDied","Data":"d50d309edb14e8fbd695df0bc0f190f87ddffa0f73e995de5e4c81a50df36b98"} Apr 16 16:33:15.278007 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:15.277971 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" path="/var/lib/kubelet/pods/eda2ad7a-c8a0-406c-b0a4-37e69a12aa13/volumes" Apr 16 16:33:16.034962 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:16.034926 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" event={"ID":"64248505-177e-4509-a785-8197d9751694","Type":"ContainerStarted","Data":"0135ff03eff90d11736688bd952ac3e83b0e0d4c102bebf9da9480b6b67cffd4"} Apr 16 16:33:16.035507 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:16.035241 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" Apr 16 16:33:16.036626 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:16.036604 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" podUID="64248505-177e-4509-a785-8197d9751694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 16:33:16.052750 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:16.052705 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" podStartSLOduration=6.05268874 podStartE2EDuration="6.05268874s" podCreationTimestamp="2026-04-16 16:33:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:33:16.051623362 +0000 UTC m=+1819.406613491" watchObservedRunningTime="2026-04-16 16:33:16.05268874 +0000 UTC m=+1819.407678870" Apr 16 16:33:17.038697 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:17.038661 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" podUID="64248505-177e-4509-a785-8197d9751694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 16:33:27.039563 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:27.039519 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" podUID="64248505-177e-4509-a785-8197d9751694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 16:33:37.038917 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:37.038874 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" podUID="64248505-177e-4509-a785-8197d9751694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 16:33:47.039104 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:47.039062 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" podUID="64248505-177e-4509-a785-8197d9751694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 16:33:57.039445 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:33:57.039401 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" podUID="64248505-177e-4509-a785-8197d9751694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 16:34:07.038760 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:07.038720 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" podUID="64248505-177e-4509-a785-8197d9751694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 16:34:17.039155 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:17.039069 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" podUID="64248505-177e-4509-a785-8197d9751694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 16:34:19.278309 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:19.278282 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" Apr 16 16:34:20.513360 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.513327 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg"] Apr 16 16:34:20.513711 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.513637 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="storage-initializer" Apr 16 16:34:20.513711 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.513649 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="storage-initializer" 
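The repeated `prober.go:120] "Probe failed"` entries above (one every ~10 s until the pod finally reports ready at 16:34:19) are easy to summarize mechanically, since kubelet's structured logging emits `key="value"` fields. A minimal sketch, assuming each journal entry reaches the script as a single line; the regex field names and the `count_probe_failures` helper are illustrative, not part of any kubelet tooling:

```python
import re

# kubelet structured-log entries carry key="value" pairs; pull out the fields
# relevant to readiness-probe failures. The lazy ".*?" keeps each group
# anchored to the first matching field after probeType.
PROBE_RE = re.compile(
    r'probeType="(?P<type>\w+)".*? pod="(?P<pod>[^"]+)".*?probeResult="(?P<result>\w+)"'
)

def count_probe_failures(lines):
    """Return {pod: failure_count} for Readiness probe failures."""
    counts = {}
    for line in lines:
        m = PROBE_RE.search(line)
        if m and m.group("type") == "Readiness" and m.group("result") == "failure":
            pod = m.group("pod")
            counts[pod] = counts.get(pod, 0) + 1
    return counts

# Two entries copied from the log above, flattened to single lines.
sample = [
    'I0416 16:33:16.036604 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" podUID="64248505-177e-4509-a785-8197d9751694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"',
    'I0416 16:33:17.038661 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" podUID="64248505-177e-4509-a785-8197d9751694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused"',
]

print(count_probe_failures(sample))
# {'kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh': 2}
```

The failure output itself (`connect: connection refused` on 10.133.0.45:8080) indicates the kserve-container had not yet bound its serving port; the probes stop failing once the model server finishes starting, matching the `status="ready"` transition logged at 16:34:19.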
Apr 16 16:34:20.513711 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.513661 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="kserve-container" Apr 16 16:34:20.513711 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.513666 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="kserve-container" Apr 16 16:34:20.513840 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.513742 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="eda2ad7a-c8a0-406c-b0a4-37e69a12aa13" containerName="kserve-container" Apr 16 16:34:20.517885 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.517867 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" Apr 16 16:34:20.520121 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.520095 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-14ca3c\"" Apr 16 16:34:20.520242 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.520180 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-14ca3c-dockercfg-drgnr\"" Apr 16 16:34:20.520940 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.520926 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 16:34:20.524383 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.524355 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg"] Apr 16 16:34:20.593506 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.593475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/88dff65d-5251-4c89-b55e-9fd74878517a-cabundle-cert\") pod \"isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg\" (UID: \"88dff65d-5251-4c89-b55e-9fd74878517a\") " pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" Apr 16 16:34:20.593638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.593519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88dff65d-5251-4c89-b55e-9fd74878517a-kserve-provision-location\") pod \"isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg\" (UID: \"88dff65d-5251-4c89-b55e-9fd74878517a\") " pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" Apr 16 16:34:20.694449 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.694401 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/88dff65d-5251-4c89-b55e-9fd74878517a-cabundle-cert\") pod \"isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg\" (UID: \"88dff65d-5251-4c89-b55e-9fd74878517a\") " pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" Apr 16 16:34:20.694449 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.694454 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88dff65d-5251-4c89-b55e-9fd74878517a-kserve-provision-location\") pod \"isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg\" (UID: \"88dff65d-5251-4c89-b55e-9fd74878517a\") " pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" Apr 16 16:34:20.694823 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.694807 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88dff65d-5251-4c89-b55e-9fd74878517a-kserve-provision-location\") pod 
\"isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg\" (UID: \"88dff65d-5251-4c89-b55e-9fd74878517a\") " pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" Apr 16 16:34:20.695018 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.694999 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/88dff65d-5251-4c89-b55e-9fd74878517a-cabundle-cert\") pod \"isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg\" (UID: \"88dff65d-5251-4c89-b55e-9fd74878517a\") " pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" Apr 16 16:34:20.828628 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.828548 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" Apr 16 16:34:20.946902 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:20.946561 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg"] Apr 16 16:34:20.949571 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:34:20.949538 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88dff65d_5251_4c89_b55e_9fd74878517a.slice/crio-e6c26cc9a995a958b0dae5cc8d4812112aab4c1989473783b46a56837401a5b6 WatchSource:0}: Error finding container e6c26cc9a995a958b0dae5cc8d4812112aab4c1989473783b46a56837401a5b6: Status 404 returned error can't find the container with id e6c26cc9a995a958b0dae5cc8d4812112aab4c1989473783b46a56837401a5b6 Apr 16 16:34:21.238111 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:21.238075 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" event={"ID":"88dff65d-5251-4c89-b55e-9fd74878517a","Type":"ContainerStarted","Data":"b15c5b42b1023f4c51ad6071dd6f2eacd31b2790e3f1d45af5cfca198b9b9faf"} Apr 16 
16:34:21.238111 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:21.238119 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" event={"ID":"88dff65d-5251-4c89-b55e-9fd74878517a","Type":"ContainerStarted","Data":"e6c26cc9a995a958b0dae5cc8d4812112aab4c1989473783b46a56837401a5b6"} Apr 16 16:34:25.253268 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:25.253239 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg_88dff65d-5251-4c89-b55e-9fd74878517a/storage-initializer/0.log" Apr 16 16:34:25.253624 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:25.253277 2576 generic.go:358] "Generic (PLEG): container finished" podID="88dff65d-5251-4c89-b55e-9fd74878517a" containerID="b15c5b42b1023f4c51ad6071dd6f2eacd31b2790e3f1d45af5cfca198b9b9faf" exitCode=1 Apr 16 16:34:25.253624 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:25.253307 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" event={"ID":"88dff65d-5251-4c89-b55e-9fd74878517a","Type":"ContainerDied","Data":"b15c5b42b1023f4c51ad6071dd6f2eacd31b2790e3f1d45af5cfca198b9b9faf"} Apr 16 16:34:26.257363 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:26.257334 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg_88dff65d-5251-4c89-b55e-9fd74878517a/storage-initializer/0.log" Apr 16 16:34:26.257739 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:26.257381 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" event={"ID":"88dff65d-5251-4c89-b55e-9fd74878517a","Type":"ContainerStarted","Data":"acbd04b840d75fadd4449c7ac5b8b252c065d0e402b37c7b7315e7869828fbb5"} Apr 16 16:34:29.267558 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:29.267529 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg_88dff65d-5251-4c89-b55e-9fd74878517a/storage-initializer/1.log" Apr 16 16:34:29.267961 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:29.267880 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg_88dff65d-5251-4c89-b55e-9fd74878517a/storage-initializer/0.log" Apr 16 16:34:29.267961 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:29.267913 2576 generic.go:358] "Generic (PLEG): container finished" podID="88dff65d-5251-4c89-b55e-9fd74878517a" containerID="acbd04b840d75fadd4449c7ac5b8b252c065d0e402b37c7b7315e7869828fbb5" exitCode=1 Apr 16 16:34:29.268037 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:29.267984 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" event={"ID":"88dff65d-5251-4c89-b55e-9fd74878517a","Type":"ContainerDied","Data":"acbd04b840d75fadd4449c7ac5b8b252c065d0e402b37c7b7315e7869828fbb5"} Apr 16 16:34:29.268037 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:29.268025 2576 scope.go:117] "RemoveContainer" containerID="b15c5b42b1023f4c51ad6071dd6f2eacd31b2790e3f1d45af5cfca198b9b9faf" Apr 16 16:34:29.268306 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:29.268280 2576 scope.go:117] "RemoveContainer" containerID="b15c5b42b1023f4c51ad6071dd6f2eacd31b2790e3f1d45af5cfca198b9b9faf" Apr 16 16:34:29.278372 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:34:29.278345 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg_kserve-ci-e2e-test_88dff65d-5251-4c89-b55e-9fd74878517a_0 in pod sandbox e6c26cc9a995a958b0dae5cc8d4812112aab4c1989473783b46a56837401a5b6 from index: no such id: 
'b15c5b42b1023f4c51ad6071dd6f2eacd31b2790e3f1d45af5cfca198b9b9faf'" containerID="b15c5b42b1023f4c51ad6071dd6f2eacd31b2790e3f1d45af5cfca198b9b9faf" Apr 16 16:34:29.278436 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:34:29.278398 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg_kserve-ci-e2e-test_88dff65d-5251-4c89-b55e-9fd74878517a_0 in pod sandbox e6c26cc9a995a958b0dae5cc8d4812112aab4c1989473783b46a56837401a5b6 from index: no such id: 'b15c5b42b1023f4c51ad6071dd6f2eacd31b2790e3f1d45af5cfca198b9b9faf'; Skipping pod \"isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg_kserve-ci-e2e-test(88dff65d-5251-4c89-b55e-9fd74878517a)\"" logger="UnhandledError" Apr 16 16:34:29.279666 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:34:29.279646 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg_kserve-ci-e2e-test(88dff65d-5251-4c89-b55e-9fd74878517a)\"" pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" podUID="88dff65d-5251-4c89-b55e-9fd74878517a" Apr 16 16:34:30.272018 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:30.271988 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg_88dff65d-5251-4c89-b55e-9fd74878517a/storage-initializer/1.log" Apr 16 16:34:36.587881 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.587852 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg"] Apr 16 16:34:36.634498 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.634427 2576 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh"] Apr 16 16:34:36.634934 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.634781 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" podUID="64248505-177e-4509-a785-8197d9751694" containerName="kserve-container" containerID="cri-o://0135ff03eff90d11736688bd952ac3e83b0e0d4c102bebf9da9480b6b67cffd4" gracePeriod=30 Apr 16 16:34:36.706096 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.706048 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj"] Apr 16 16:34:36.710542 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.710520 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" Apr 16 16:34:36.712740 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.712717 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-c577e0\"" Apr 16 16:34:36.712854 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.712778 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-c577e0-dockercfg-cbqxg\"" Apr 16 16:34:36.718473 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.718353 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj"] Apr 16 16:34:36.737799 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.737778 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg_88dff65d-5251-4c89-b55e-9fd74878517a/storage-initializer/1.log" Apr 16 16:34:36.737888 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.737834 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" Apr 16 16:34:36.815541 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.815515 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/88dff65d-5251-4c89-b55e-9fd74878517a-cabundle-cert\") pod \"88dff65d-5251-4c89-b55e-9fd74878517a\" (UID: \"88dff65d-5251-4c89-b55e-9fd74878517a\") " Apr 16 16:34:36.815671 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.815557 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88dff65d-5251-4c89-b55e-9fd74878517a-kserve-provision-location\") pod \"88dff65d-5251-4c89-b55e-9fd74878517a\" (UID: \"88dff65d-5251-4c89-b55e-9fd74878517a\") " Apr 16 16:34:36.815671 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.815660 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fe8c6b53-acde-4996-a75f-0ad591e95cad-cabundle-cert\") pod \"isvc-init-fail-c577e0-predictor-6f8468454b-drlkj\" (UID: \"fe8c6b53-acde-4996-a75f-0ad591e95cad\") " pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" Apr 16 16:34:36.815786 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.815713 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe8c6b53-acde-4996-a75f-0ad591e95cad-kserve-provision-location\") pod \"isvc-init-fail-c577e0-predictor-6f8468454b-drlkj\" (UID: \"fe8c6b53-acde-4996-a75f-0ad591e95cad\") " pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" Apr 16 16:34:36.815862 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.815841 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/88dff65d-5251-4c89-b55e-9fd74878517a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "88dff65d-5251-4c89-b55e-9fd74878517a" (UID: "88dff65d-5251-4c89-b55e-9fd74878517a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:34:36.815926 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.815896 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88dff65d-5251-4c89-b55e-9fd74878517a-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "88dff65d-5251-4c89-b55e-9fd74878517a" (UID: "88dff65d-5251-4c89-b55e-9fd74878517a"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:34:36.916906 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.916869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe8c6b53-acde-4996-a75f-0ad591e95cad-kserve-provision-location\") pod \"isvc-init-fail-c577e0-predictor-6f8468454b-drlkj\" (UID: \"fe8c6b53-acde-4996-a75f-0ad591e95cad\") " pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" Apr 16 16:34:36.917010 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.916932 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fe8c6b53-acde-4996-a75f-0ad591e95cad-cabundle-cert\") pod \"isvc-init-fail-c577e0-predictor-6f8468454b-drlkj\" (UID: \"fe8c6b53-acde-4996-a75f-0ad591e95cad\") " pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" Apr 16 16:34:36.917010 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.916974 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/88dff65d-5251-4c89-b55e-9fd74878517a-cabundle-cert\") on node 
\"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:34:36.917010 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.916985 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88dff65d-5251-4c89-b55e-9fd74878517a-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:34:36.917263 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.917245 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe8c6b53-acde-4996-a75f-0ad591e95cad-kserve-provision-location\") pod \"isvc-init-fail-c577e0-predictor-6f8468454b-drlkj\" (UID: \"fe8c6b53-acde-4996-a75f-0ad591e95cad\") " pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" Apr 16 16:34:36.917503 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:36.917489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fe8c6b53-acde-4996-a75f-0ad591e95cad-cabundle-cert\") pod \"isvc-init-fail-c577e0-predictor-6f8468454b-drlkj\" (UID: \"fe8c6b53-acde-4996-a75f-0ad591e95cad\") " pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" Apr 16 16:34:37.034246 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:37.034197 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" Apr 16 16:34:37.149964 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:37.149930 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj"] Apr 16 16:34:37.153076 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:34:37.153035 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe8c6b53_acde_4996_a75f_0ad591e95cad.slice/crio-6fd180150959786540bbfb5b857f7a0e073f37e3caed87b76eb0366d4386cd91 WatchSource:0}: Error finding container 6fd180150959786540bbfb5b857f7a0e073f37e3caed87b76eb0366d4386cd91: Status 404 returned error can't find the container with id 6fd180150959786540bbfb5b857f7a0e073f37e3caed87b76eb0366d4386cd91 Apr 16 16:34:37.295769 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:37.295749 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg_88dff65d-5251-4c89-b55e-9fd74878517a/storage-initializer/1.log" Apr 16 16:34:37.295900 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:37.295848 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" event={"ID":"88dff65d-5251-4c89-b55e-9fd74878517a","Type":"ContainerDied","Data":"e6c26cc9a995a958b0dae5cc8d4812112aab4c1989473783b46a56837401a5b6"} Apr 16 16:34:37.295900 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:37.295885 2576 scope.go:117] "RemoveContainer" containerID="acbd04b840d75fadd4449c7ac5b8b252c065d0e402b37c7b7315e7869828fbb5" Apr 16 16:34:37.296022 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:37.295896 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg" Apr 16 16:34:37.297806 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:37.297781 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" event={"ID":"fe8c6b53-acde-4996-a75f-0ad591e95cad","Type":"ContainerStarted","Data":"fe537347df979593799ac3213f2c1b38dc5a83b89e8b51450c85b33cbc10e799"} Apr 16 16:34:37.298034 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:37.298015 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" event={"ID":"fe8c6b53-acde-4996-a75f-0ad591e95cad","Type":"ContainerStarted","Data":"6fd180150959786540bbfb5b857f7a0e073f37e3caed87b76eb0366d4386cd91"} Apr 16 16:34:37.326941 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:37.326915 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg"] Apr 16 16:34:37.330783 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:37.330747 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-14ca3c-predictor-5b5dd775d7-gfndg"] Apr 16 16:34:39.274901 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:39.274857 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" podUID="64248505-177e-4509-a785-8197d9751694" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 16:34:39.278223 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:39.278184 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88dff65d-5251-4c89-b55e-9fd74878517a" path="/var/lib/kubelet/pods/88dff65d-5251-4c89-b55e-9fd74878517a/volumes" Apr 16 16:34:40.782609 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:40.782585 2576 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" Apr 16 16:34:40.847606 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:40.847543 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/64248505-177e-4509-a785-8197d9751694-kserve-provision-location\") pod \"64248505-177e-4509-a785-8197d9751694\" (UID: \"64248505-177e-4509-a785-8197d9751694\") " Apr 16 16:34:40.847836 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:40.847814 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64248505-177e-4509-a785-8197d9751694-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "64248505-177e-4509-a785-8197d9751694" (UID: "64248505-177e-4509-a785-8197d9751694"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:34:40.948153 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:40.948127 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/64248505-177e-4509-a785-8197d9751694-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:34:41.312185 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.312163 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-c577e0-predictor-6f8468454b-drlkj_fe8c6b53-acde-4996-a75f-0ad591e95cad/storage-initializer/0.log" Apr 16 16:34:41.312320 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.312198 2576 generic.go:358] "Generic (PLEG): container finished" podID="fe8c6b53-acde-4996-a75f-0ad591e95cad" containerID="fe537347df979593799ac3213f2c1b38dc5a83b89e8b51450c85b33cbc10e799" exitCode=1 Apr 16 16:34:41.312320 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.312275 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" event={"ID":"fe8c6b53-acde-4996-a75f-0ad591e95cad","Type":"ContainerDied","Data":"fe537347df979593799ac3213f2c1b38dc5a83b89e8b51450c85b33cbc10e799"} Apr 16 16:34:41.313608 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.313585 2576 generic.go:358] "Generic (PLEG): container finished" podID="64248505-177e-4509-a785-8197d9751694" containerID="0135ff03eff90d11736688bd952ac3e83b0e0d4c102bebf9da9480b6b67cffd4" exitCode=0 Apr 16 16:34:41.313707 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.313617 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" event={"ID":"64248505-177e-4509-a785-8197d9751694","Type":"ContainerDied","Data":"0135ff03eff90d11736688bd952ac3e83b0e0d4c102bebf9da9480b6b67cffd4"} Apr 16 16:34:41.313707 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.313658 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" event={"ID":"64248505-177e-4509-a785-8197d9751694","Type":"ContainerDied","Data":"0f215b7c73ecca935f2907f9f9f3e76aef11ece558f65b313d9927861566b847"} Apr 16 16:34:41.313707 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.313663 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh" Apr 16 16:34:41.313707 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.313678 2576 scope.go:117] "RemoveContainer" containerID="0135ff03eff90d11736688bd952ac3e83b0e0d4c102bebf9da9480b6b67cffd4" Apr 16 16:34:41.321223 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.321193 2576 scope.go:117] "RemoveContainer" containerID="d50d309edb14e8fbd695df0bc0f190f87ddffa0f73e995de5e4c81a50df36b98" Apr 16 16:34:41.327995 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.327971 2576 scope.go:117] "RemoveContainer" containerID="0135ff03eff90d11736688bd952ac3e83b0e0d4c102bebf9da9480b6b67cffd4" Apr 16 16:34:41.328262 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:34:41.328244 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0135ff03eff90d11736688bd952ac3e83b0e0d4c102bebf9da9480b6b67cffd4\": container with ID starting with 0135ff03eff90d11736688bd952ac3e83b0e0d4c102bebf9da9480b6b67cffd4 not found: ID does not exist" containerID="0135ff03eff90d11736688bd952ac3e83b0e0d4c102bebf9da9480b6b67cffd4" Apr 16 16:34:41.328333 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.328270 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0135ff03eff90d11736688bd952ac3e83b0e0d4c102bebf9da9480b6b67cffd4"} err="failed to get container status \"0135ff03eff90d11736688bd952ac3e83b0e0d4c102bebf9da9480b6b67cffd4\": rpc error: code = NotFound desc = could not find container \"0135ff03eff90d11736688bd952ac3e83b0e0d4c102bebf9da9480b6b67cffd4\": container with ID starting with 0135ff03eff90d11736688bd952ac3e83b0e0d4c102bebf9da9480b6b67cffd4 not found: ID does not exist" Apr 16 16:34:41.328333 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.328286 2576 scope.go:117] "RemoveContainer" containerID="d50d309edb14e8fbd695df0bc0f190f87ddffa0f73e995de5e4c81a50df36b98" Apr 16 
16:34:41.328554 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:34:41.328537 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d50d309edb14e8fbd695df0bc0f190f87ddffa0f73e995de5e4c81a50df36b98\": container with ID starting with d50d309edb14e8fbd695df0bc0f190f87ddffa0f73e995de5e4c81a50df36b98 not found: ID does not exist" containerID="d50d309edb14e8fbd695df0bc0f190f87ddffa0f73e995de5e4c81a50df36b98" Apr 16 16:34:41.328618 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.328559 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d50d309edb14e8fbd695df0bc0f190f87ddffa0f73e995de5e4c81a50df36b98"} err="failed to get container status \"d50d309edb14e8fbd695df0bc0f190f87ddffa0f73e995de5e4c81a50df36b98\": rpc error: code = NotFound desc = could not find container \"d50d309edb14e8fbd695df0bc0f190f87ddffa0f73e995de5e4c81a50df36b98\": container with ID starting with d50d309edb14e8fbd695df0bc0f190f87ddffa0f73e995de5e4c81a50df36b98 not found: ID does not exist" Apr 16 16:34:41.344053 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.344020 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh"] Apr 16 16:34:41.345708 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.345690 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-14ca3c-predictor-6b6887bd78-ljmlh"] Apr 16 16:34:41.731371 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.731340 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj"] Apr 16 16:34:41.838906 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.838878 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm"] Apr 16 16:34:41.839273 ip-10-0-137-150 kubenswrapper[2576]: I0416 
16:34:41.839181 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88dff65d-5251-4c89-b55e-9fd74878517a" containerName="storage-initializer" Apr 16 16:34:41.839273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.839192 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="88dff65d-5251-4c89-b55e-9fd74878517a" containerName="storage-initializer" Apr 16 16:34:41.839273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.839205 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="64248505-177e-4509-a785-8197d9751694" containerName="storage-initializer" Apr 16 16:34:41.839273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.839226 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="64248505-177e-4509-a785-8197d9751694" containerName="storage-initializer" Apr 16 16:34:41.839273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.839234 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88dff65d-5251-4c89-b55e-9fd74878517a" containerName="storage-initializer" Apr 16 16:34:41.839273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.839245 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="88dff65d-5251-4c89-b55e-9fd74878517a" containerName="storage-initializer" Apr 16 16:34:41.839273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.839254 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="64248505-177e-4509-a785-8197d9751694" containerName="kserve-container" Apr 16 16:34:41.839273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.839259 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="64248505-177e-4509-a785-8197d9751694" containerName="kserve-container" Apr 16 16:34:41.839530 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.839310 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="88dff65d-5251-4c89-b55e-9fd74878517a" containerName="storage-initializer" Apr 16 
16:34:41.839530 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.839318 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="88dff65d-5251-4c89-b55e-9fd74878517a" containerName="storage-initializer" Apr 16 16:34:41.839530 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.839327 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="64248505-177e-4509-a785-8197d9751694" containerName="kserve-container" Apr 16 16:34:41.845650 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.843692 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" Apr 16 16:34:41.847031 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.847006 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-wzmqv\"" Apr 16 16:34:41.853564 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.853541 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm"] Apr 16 16:34:41.956552 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:41.956519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/451cf185-c631-45d8-938e-c45f5a05922b-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm\" (UID: \"451cf185-c631-45d8-938e-c45f5a05922b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" Apr 16 16:34:42.057145 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:42.057056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/451cf185-c631-45d8-938e-c45f5a05922b-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm\" (UID: 
\"451cf185-c631-45d8-938e-c45f5a05922b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" Apr 16 16:34:42.057564 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:42.057537 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/451cf185-c631-45d8-938e-c45f5a05922b-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm\" (UID: \"451cf185-c631-45d8-938e-c45f5a05922b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" Apr 16 16:34:42.155384 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:42.155350 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" Apr 16 16:34:42.274545 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:42.274471 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm"] Apr 16 16:34:42.276608 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:34:42.276577 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod451cf185_c631_45d8_938e_c45f5a05922b.slice/crio-f04eef4b279d26aae369fb69c3565e06da35d16c2a45f138183ea5f5f1f61b5e WatchSource:0}: Error finding container f04eef4b279d26aae369fb69c3565e06da35d16c2a45f138183ea5f5f1f61b5e: Status 404 returned error can't find the container with id f04eef4b279d26aae369fb69c3565e06da35d16c2a45f138183ea5f5f1f61b5e Apr 16 16:34:42.318822 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:42.318802 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-c577e0-predictor-6f8468454b-drlkj_fe8c6b53-acde-4996-a75f-0ad591e95cad/storage-initializer/0.log" Apr 16 16:34:42.318926 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:42.318914 2576 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" event={"ID":"fe8c6b53-acde-4996-a75f-0ad591e95cad","Type":"ContainerStarted","Data":"522fc9e3f8849a30cc093cc0e300fabc2875c33b5d9920c8f6a6ad36cb6ad05e"} Apr 16 16:34:42.319065 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:42.319020 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" podUID="fe8c6b53-acde-4996-a75f-0ad591e95cad" containerName="storage-initializer" containerID="cri-o://522fc9e3f8849a30cc093cc0e300fabc2875c33b5d9920c8f6a6ad36cb6ad05e" gracePeriod=30 Apr 16 16:34:42.320823 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:42.320797 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" event={"ID":"451cf185-c631-45d8-938e-c45f5a05922b","Type":"ContainerStarted","Data":"f04eef4b279d26aae369fb69c3565e06da35d16c2a45f138183ea5f5f1f61b5e"} Apr 16 16:34:43.278893 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:43.278861 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64248505-177e-4509-a785-8197d9751694" path="/var/lib/kubelet/pods/64248505-177e-4509-a785-8197d9751694/volumes" Apr 16 16:34:43.324544 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:43.324517 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" event={"ID":"451cf185-c631-45d8-938e-c45f5a05922b","Type":"ContainerStarted","Data":"f36c5f9771476c7bf96d53e461bc95f69e7c377b23e5b27147097f2204b1e8a0"} Apr 16 16:34:46.335467 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:46.335432 2576 generic.go:358] "Generic (PLEG): container finished" podID="451cf185-c631-45d8-938e-c45f5a05922b" containerID="f36c5f9771476c7bf96d53e461bc95f69e7c377b23e5b27147097f2204b1e8a0" exitCode=0 Apr 16 16:34:46.335802 ip-10-0-137-150 kubenswrapper[2576]: 
I0416 16:34:46.335494 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" event={"ID":"451cf185-c631-45d8-938e-c45f5a05922b","Type":"ContainerDied","Data":"f36c5f9771476c7bf96d53e461bc95f69e7c377b23e5b27147097f2204b1e8a0"} Apr 16 16:34:47.070491 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.070464 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-c577e0-predictor-6f8468454b-drlkj_fe8c6b53-acde-4996-a75f-0ad591e95cad/storage-initializer/1.log" Apr 16 16:34:47.070909 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.070888 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-c577e0-predictor-6f8468454b-drlkj_fe8c6b53-acde-4996-a75f-0ad591e95cad/storage-initializer/0.log" Apr 16 16:34:47.071040 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.070967 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" Apr 16 16:34:47.201779 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.201740 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fe8c6b53-acde-4996-a75f-0ad591e95cad-cabundle-cert\") pod \"fe8c6b53-acde-4996-a75f-0ad591e95cad\" (UID: \"fe8c6b53-acde-4996-a75f-0ad591e95cad\") " Apr 16 16:34:47.201970 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.201827 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe8c6b53-acde-4996-a75f-0ad591e95cad-kserve-provision-location\") pod \"fe8c6b53-acde-4996-a75f-0ad591e95cad\" (UID: \"fe8c6b53-acde-4996-a75f-0ad591e95cad\") " Apr 16 16:34:47.202129 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.202094 2576 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/fe8c6b53-acde-4996-a75f-0ad591e95cad-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fe8c6b53-acde-4996-a75f-0ad591e95cad" (UID: "fe8c6b53-acde-4996-a75f-0ad591e95cad"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:34:47.202257 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.202124 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe8c6b53-acde-4996-a75f-0ad591e95cad-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "fe8c6b53-acde-4996-a75f-0ad591e95cad" (UID: "fe8c6b53-acde-4996-a75f-0ad591e95cad"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:34:47.303070 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.303009 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fe8c6b53-acde-4996-a75f-0ad591e95cad-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:34:47.303070 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.303037 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fe8c6b53-acde-4996-a75f-0ad591e95cad-cabundle-cert\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:34:47.340278 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.340257 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-c577e0-predictor-6f8468454b-drlkj_fe8c6b53-acde-4996-a75f-0ad591e95cad/storage-initializer/1.log" Apr 16 16:34:47.340735 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.340716 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-c577e0-predictor-6f8468454b-drlkj_fe8c6b53-acde-4996-a75f-0ad591e95cad/storage-initializer/0.log" Apr 16 16:34:47.340816 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.340761 2576 generic.go:358] "Generic (PLEG): container finished" podID="fe8c6b53-acde-4996-a75f-0ad591e95cad" containerID="522fc9e3f8849a30cc093cc0e300fabc2875c33b5d9920c8f6a6ad36cb6ad05e" exitCode=1 Apr 16 16:34:47.340816 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.340798 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" event={"ID":"fe8c6b53-acde-4996-a75f-0ad591e95cad","Type":"ContainerDied","Data":"522fc9e3f8849a30cc093cc0e300fabc2875c33b5d9920c8f6a6ad36cb6ad05e"} Apr 16 16:34:47.340927 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.340829 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" event={"ID":"fe8c6b53-acde-4996-a75f-0ad591e95cad","Type":"ContainerDied","Data":"6fd180150959786540bbfb5b857f7a0e073f37e3caed87b76eb0366d4386cd91"} Apr 16 16:34:47.340927 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.340849 2576 scope.go:117] "RemoveContainer" containerID="522fc9e3f8849a30cc093cc0e300fabc2875c33b5d9920c8f6a6ad36cb6ad05e" Apr 16 16:34:47.340927 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.340873 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj" Apr 16 16:34:47.349931 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.349906 2576 scope.go:117] "RemoveContainer" containerID="fe537347df979593799ac3213f2c1b38dc5a83b89e8b51450c85b33cbc10e799" Apr 16 16:34:47.357617 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.357599 2576 scope.go:117] "RemoveContainer" containerID="522fc9e3f8849a30cc093cc0e300fabc2875c33b5d9920c8f6a6ad36cb6ad05e" Apr 16 16:34:47.357864 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:34:47.357838 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"522fc9e3f8849a30cc093cc0e300fabc2875c33b5d9920c8f6a6ad36cb6ad05e\": container with ID starting with 522fc9e3f8849a30cc093cc0e300fabc2875c33b5d9920c8f6a6ad36cb6ad05e not found: ID does not exist" containerID="522fc9e3f8849a30cc093cc0e300fabc2875c33b5d9920c8f6a6ad36cb6ad05e" Apr 16 16:34:47.357929 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.357870 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"522fc9e3f8849a30cc093cc0e300fabc2875c33b5d9920c8f6a6ad36cb6ad05e"} err="failed to get container status \"522fc9e3f8849a30cc093cc0e300fabc2875c33b5d9920c8f6a6ad36cb6ad05e\": rpc error: code = NotFound desc = could not find container \"522fc9e3f8849a30cc093cc0e300fabc2875c33b5d9920c8f6a6ad36cb6ad05e\": container with ID starting with 522fc9e3f8849a30cc093cc0e300fabc2875c33b5d9920c8f6a6ad36cb6ad05e not found: ID does not exist" Apr 16 16:34:47.357929 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.357888 2576 scope.go:117] "RemoveContainer" containerID="fe537347df979593799ac3213f2c1b38dc5a83b89e8b51450c85b33cbc10e799" Apr 16 16:34:47.358168 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:34:47.358148 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"fe537347df979593799ac3213f2c1b38dc5a83b89e8b51450c85b33cbc10e799\": container with ID starting with fe537347df979593799ac3213f2c1b38dc5a83b89e8b51450c85b33cbc10e799 not found: ID does not exist" containerID="fe537347df979593799ac3213f2c1b38dc5a83b89e8b51450c85b33cbc10e799" Apr 16 16:34:47.358257 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.358174 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe537347df979593799ac3213f2c1b38dc5a83b89e8b51450c85b33cbc10e799"} err="failed to get container status \"fe537347df979593799ac3213f2c1b38dc5a83b89e8b51450c85b33cbc10e799\": rpc error: code = NotFound desc = could not find container \"fe537347df979593799ac3213f2c1b38dc5a83b89e8b51450c85b33cbc10e799\": container with ID starting with fe537347df979593799ac3213f2c1b38dc5a83b89e8b51450c85b33cbc10e799 not found: ID does not exist" Apr 16 16:34:47.373731 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.373692 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj"] Apr 16 16:34:47.375960 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:47.375938 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-c577e0-predictor-6f8468454b-drlkj"] Apr 16 16:34:49.278948 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:34:49.278916 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe8c6b53-acde-4996-a75f-0ad591e95cad" path="/var/lib/kubelet/pods/fe8c6b53-acde-4996-a75f-0ad591e95cad/volumes" Apr 16 16:35:06.412692 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:35:06.412658 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" event={"ID":"451cf185-c631-45d8-938e-c45f5a05922b","Type":"ContainerStarted","Data":"38c6ba9e6c7ea829e1183f96abd5ac04038c5e349e591f0a3fba30aee6c60f27"} Apr 16 16:35:06.413144 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:35:06.412977 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" Apr 16 16:35:06.414013 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:35:06.413983 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" podUID="451cf185-c631-45d8-938e-c45f5a05922b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 16:35:06.429787 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:35:06.429731 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" podStartSLOduration=6.206486977 podStartE2EDuration="25.429716165s" podCreationTimestamp="2026-04-16 16:34:41 +0000 UTC" firstStartedPulling="2026-04-16 16:34:46.336658121 +0000 UTC m=+1909.691648229" lastFinishedPulling="2026-04-16 16:35:05.559887309 +0000 UTC m=+1928.914877417" observedRunningTime="2026-04-16 16:35:06.428517849 +0000 UTC m=+1929.783507981" watchObservedRunningTime="2026-04-16 16:35:06.429716165 +0000 UTC m=+1929.784706297" Apr 16 16:35:07.416438 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:35:07.416405 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" podUID="451cf185-c631-45d8-938e-c45f5a05922b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 16:35:17.416909 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:35:17.416865 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" podUID="451cf185-c631-45d8-938e-c45f5a05922b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: 
connection refused" Apr 16 16:35:27.417373 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:35:27.417329 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" podUID="451cf185-c631-45d8-938e-c45f5a05922b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 16:35:37.417127 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:35:37.417080 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" podUID="451cf185-c631-45d8-938e-c45f5a05922b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 16:35:47.417288 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:35:47.417177 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" podUID="451cf185-c631-45d8-938e-c45f5a05922b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 16:35:57.416589 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:35:57.416549 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" podUID="451cf185-c631-45d8-938e-c45f5a05922b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 16:36:07.417185 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:07.417144 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" podUID="451cf185-c631-45d8-938e-c45f5a05922b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.48:8080: connect: connection refused" Apr 16 16:36:17.418553 ip-10-0-137-150 kubenswrapper[2576]: I0416 
16:36:17.418521 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" Apr 16 16:36:22.012664 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.012631 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm"] Apr 16 16:36:22.013157 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.012968 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" podUID="451cf185-c631-45d8-938e-c45f5a05922b" containerName="kserve-container" containerID="cri-o://38c6ba9e6c7ea829e1183f96abd5ac04038c5e349e591f0a3fba30aee6c60f27" gracePeriod=30 Apr 16 16:36:22.100047 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.100011 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr"] Apr 16 16:36:22.100341 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.100329 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe8c6b53-acde-4996-a75f-0ad591e95cad" containerName="storage-initializer" Apr 16 16:36:22.100398 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.100342 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8c6b53-acde-4996-a75f-0ad591e95cad" containerName="storage-initializer" Apr 16 16:36:22.100398 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.100351 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fe8c6b53-acde-4996-a75f-0ad591e95cad" containerName="storage-initializer" Apr 16 16:36:22.100398 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.100357 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe8c6b53-acde-4996-a75f-0ad591e95cad" containerName="storage-initializer" Apr 16 16:36:22.100519 ip-10-0-137-150 kubenswrapper[2576]: I0416 
16:36:22.100414 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe8c6b53-acde-4996-a75f-0ad591e95cad" containerName="storage-initializer" Apr 16 16:36:22.100519 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.100504 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fe8c6b53-acde-4996-a75f-0ad591e95cad" containerName="storage-initializer" Apr 16 16:36:22.103573 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.103553 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" Apr 16 16:36:22.111630 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.111604 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr"] Apr 16 16:36:22.148178 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.148155 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ccea3d31-02ad-4bde-96ab-93119af51d60-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-577fdc969f-q7jfr\" (UID: \"ccea3d31-02ad-4bde-96ab-93119af51d60\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" Apr 16 16:36:22.248639 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.248611 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ccea3d31-02ad-4bde-96ab-93119af51d60-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-577fdc969f-q7jfr\" (UID: \"ccea3d31-02ad-4bde-96ab-93119af51d60\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" Apr 16 16:36:22.248974 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.248953 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ccea3d31-02ad-4bde-96ab-93119af51d60-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-577fdc969f-q7jfr\" (UID: \"ccea3d31-02ad-4bde-96ab-93119af51d60\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" Apr 16 16:36:22.414500 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.414475 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" Apr 16 16:36:22.543411 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.543380 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr"] Apr 16 16:36:22.549483 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:36:22.549456 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccea3d31_02ad_4bde_96ab_93119af51d60.slice/crio-df68dc24f6772f6eaf14cb91183d18b6876f46f09b71bb3a2dd366f29047a012 WatchSource:0}: Error finding container df68dc24f6772f6eaf14cb91183d18b6876f46f09b71bb3a2dd366f29047a012: Status 404 returned error can't find the container with id df68dc24f6772f6eaf14cb91183d18b6876f46f09b71bb3a2dd366f29047a012 Apr 16 16:36:22.646903 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.646874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" event={"ID":"ccea3d31-02ad-4bde-96ab-93119af51d60","Type":"ContainerStarted","Data":"b195442fa3767e51cc53d26cb402f9ebd0c58fa7d2eca4739246a126a51f3452"} Apr 16 16:36:22.647028 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:22.646911 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" 
event={"ID":"ccea3d31-02ad-4bde-96ab-93119af51d60","Type":"ContainerStarted","Data":"df68dc24f6772f6eaf14cb91183d18b6876f46f09b71bb3a2dd366f29047a012"} Apr 16 16:36:26.456423 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.456403 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" Apr 16 16:36:26.480686 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.480615 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/451cf185-c631-45d8-938e-c45f5a05922b-kserve-provision-location\") pod \"451cf185-c631-45d8-938e-c45f5a05922b\" (UID: \"451cf185-c631-45d8-938e-c45f5a05922b\") " Apr 16 16:36:26.480905 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.480884 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/451cf185-c631-45d8-938e-c45f5a05922b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "451cf185-c631-45d8-938e-c45f5a05922b" (UID: "451cf185-c631-45d8-938e-c45f5a05922b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:36:26.581757 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.581727 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/451cf185-c631-45d8-938e-c45f5a05922b-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:36:26.663750 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.663715 2576 generic.go:358] "Generic (PLEG): container finished" podID="451cf185-c631-45d8-938e-c45f5a05922b" containerID="38c6ba9e6c7ea829e1183f96abd5ac04038c5e349e591f0a3fba30aee6c60f27" exitCode=0 Apr 16 16:36:26.663905 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.663795 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" event={"ID":"451cf185-c631-45d8-938e-c45f5a05922b","Type":"ContainerDied","Data":"38c6ba9e6c7ea829e1183f96abd5ac04038c5e349e591f0a3fba30aee6c60f27"} Apr 16 16:36:26.663905 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.663811 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm"
Apr 16 16:36:26.663905 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.663827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm" event={"ID":"451cf185-c631-45d8-938e-c45f5a05922b","Type":"ContainerDied","Data":"f04eef4b279d26aae369fb69c3565e06da35d16c2a45f138183ea5f5f1f61b5e"}
Apr 16 16:36:26.663905 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.663843 2576 scope.go:117] "RemoveContainer" containerID="38c6ba9e6c7ea829e1183f96abd5ac04038c5e349e591f0a3fba30aee6c60f27"
Apr 16 16:36:26.665233 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.665196 2576 generic.go:358] "Generic (PLEG): container finished" podID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerID="b195442fa3767e51cc53d26cb402f9ebd0c58fa7d2eca4739246a126a51f3452" exitCode=0
Apr 16 16:36:26.665362 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.665235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" event={"ID":"ccea3d31-02ad-4bde-96ab-93119af51d60","Type":"ContainerDied","Data":"b195442fa3767e51cc53d26cb402f9ebd0c58fa7d2eca4739246a126a51f3452"}
Apr 16 16:36:26.672021 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.671944 2576 scope.go:117] "RemoveContainer" containerID="f36c5f9771476c7bf96d53e461bc95f69e7c377b23e5b27147097f2204b1e8a0"
Apr 16 16:36:26.678657 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.678640 2576 scope.go:117] "RemoveContainer" containerID="38c6ba9e6c7ea829e1183f96abd5ac04038c5e349e591f0a3fba30aee6c60f27"
Apr 16 16:36:26.678883 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:36:26.678862 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38c6ba9e6c7ea829e1183f96abd5ac04038c5e349e591f0a3fba30aee6c60f27\": container with ID starting with 38c6ba9e6c7ea829e1183f96abd5ac04038c5e349e591f0a3fba30aee6c60f27 not found: ID does not exist" containerID="38c6ba9e6c7ea829e1183f96abd5ac04038c5e349e591f0a3fba30aee6c60f27"
Apr 16 16:36:26.678954 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.678893 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c6ba9e6c7ea829e1183f96abd5ac04038c5e349e591f0a3fba30aee6c60f27"} err="failed to get container status \"38c6ba9e6c7ea829e1183f96abd5ac04038c5e349e591f0a3fba30aee6c60f27\": rpc error: code = NotFound desc = could not find container \"38c6ba9e6c7ea829e1183f96abd5ac04038c5e349e591f0a3fba30aee6c60f27\": container with ID starting with 38c6ba9e6c7ea829e1183f96abd5ac04038c5e349e591f0a3fba30aee6c60f27 not found: ID does not exist"
Apr 16 16:36:26.678954 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.678915 2576 scope.go:117] "RemoveContainer" containerID="f36c5f9771476c7bf96d53e461bc95f69e7c377b23e5b27147097f2204b1e8a0"
Apr 16 16:36:26.679135 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:36:26.679120 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f36c5f9771476c7bf96d53e461bc95f69e7c377b23e5b27147097f2204b1e8a0\": container with ID starting with f36c5f9771476c7bf96d53e461bc95f69e7c377b23e5b27147097f2204b1e8a0 not found: ID does not exist" containerID="f36c5f9771476c7bf96d53e461bc95f69e7c377b23e5b27147097f2204b1e8a0"
Apr 16 16:36:26.679185 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.679142 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f36c5f9771476c7bf96d53e461bc95f69e7c377b23e5b27147097f2204b1e8a0"} err="failed to get container status \"f36c5f9771476c7bf96d53e461bc95f69e7c377b23e5b27147097f2204b1e8a0\": rpc error: code = NotFound desc = could not find container \"f36c5f9771476c7bf96d53e461bc95f69e7c377b23e5b27147097f2204b1e8a0\": container with ID starting with f36c5f9771476c7bf96d53e461bc95f69e7c377b23e5b27147097f2204b1e8a0 not found: ID does not exist"
Apr 16 16:36:26.695753 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.695714 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm"]
Apr 16 16:36:26.697612 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:26.697588 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-5864bd8d8b-m9vfm"]
Apr 16 16:36:27.278298 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:27.278269 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="451cf185-c631-45d8-938e-c45f5a05922b" path="/var/lib/kubelet/pods/451cf185-c631-45d8-938e-c45f5a05922b/volumes"
Apr 16 16:36:27.670448 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:27.670418 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" event={"ID":"ccea3d31-02ad-4bde-96ab-93119af51d60","Type":"ContainerStarted","Data":"b53a7d154530fe79e796b0b4542ccb4b4ad3556735d49e1532fc11601c90a9dc"}
Apr 16 16:36:27.670869 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:27.670703 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr"
Apr 16 16:36:27.672035 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:27.672008 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" podUID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 16:36:27.695451 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:27.695409 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" podStartSLOduration=5.695395978 podStartE2EDuration="5.695395978s" podCreationTimestamp="2026-04-16 16:36:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:36:27.694524302 +0000 UTC m=+2011.049514433" watchObservedRunningTime="2026-04-16 16:36:27.695395978 +0000 UTC m=+2011.050386104"
Apr 16 16:36:28.674450 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:28.674407 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" podUID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 16:36:38.675320 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:38.675281 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" podUID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 16:36:48.675319 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:48.675274 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" podUID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 16:36:58.674707 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:36:58.674668 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" podUID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 16:37:08.675427 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:08.675387 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" podUID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 16:37:18.675261 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:18.675151 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" podUID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 16:37:28.674998 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:28.674958 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" podUID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 16:37:38.676156 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:38.676121 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr"
Apr 16 16:37:42.190016 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.189979 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr"]
Apr 16 16:37:42.190426 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.190271 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" podUID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerName="kserve-container" containerID="cri-o://b53a7d154530fe79e796b0b4542ccb4b4ad3556735d49e1532fc11601c90a9dc" gracePeriod=30
Apr 16 16:37:42.259866 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.259835 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq"]
Apr 16 16:37:42.260147 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.260136 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="451cf185-c631-45d8-938e-c45f5a05922b" containerName="storage-initializer"
Apr 16 16:37:42.260192 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.260150 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="451cf185-c631-45d8-938e-c45f5a05922b" containerName="storage-initializer"
Apr 16 16:37:42.260192 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.260160 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="451cf185-c631-45d8-938e-c45f5a05922b" containerName="kserve-container"
Apr 16 16:37:42.260192 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.260166 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="451cf185-c631-45d8-938e-c45f5a05922b" containerName="kserve-container"
Apr 16 16:37:42.260306 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.260239 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="451cf185-c631-45d8-938e-c45f5a05922b" containerName="kserve-container"
Apr 16 16:37:42.264121 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.264084 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq"
Apr 16 16:37:42.274243 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.274200 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq"]
Apr 16 16:37:42.434533 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.434496 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a460751e-ae5f-419f-bc07-7c6d323cb7a8-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq\" (UID: \"a460751e-ae5f-419f-bc07-7c6d323cb7a8\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq"
Apr 16 16:37:42.536012 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.535929 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a460751e-ae5f-419f-bc07-7c6d323cb7a8-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq\" (UID: \"a460751e-ae5f-419f-bc07-7c6d323cb7a8\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq"
Apr 16 16:37:42.536323 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.536304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a460751e-ae5f-419f-bc07-7c6d323cb7a8-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq\" (UID: \"a460751e-ae5f-419f-bc07-7c6d323cb7a8\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq"
Apr 16 16:37:42.575630 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.575590 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq"
Apr 16 16:37:42.695773 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.695740 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq"]
Apr 16 16:37:42.698701 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:37:42.698671 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda460751e_ae5f_419f_bc07_7c6d323cb7a8.slice/crio-156dcf569285207db143fb364bf8343ba19f7306c33911c83a6268b75bad94c8 WatchSource:0}: Error finding container 156dcf569285207db143fb364bf8343ba19f7306c33911c83a6268b75bad94c8: Status 404 returned error can't find the container with id 156dcf569285207db143fb364bf8343ba19f7306c33911c83a6268b75bad94c8
Apr 16 16:37:42.700542 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.700525 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:37:42.911094 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.911064 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" event={"ID":"a460751e-ae5f-419f-bc07-7c6d323cb7a8","Type":"ContainerStarted","Data":"f85420cb9d6e364ac855fdb3ca8de7b1990c17edb7cc5533733f3ade68f3dca0"}
Apr 16 16:37:42.911094 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:42.911099 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" event={"ID":"a460751e-ae5f-419f-bc07-7c6d323cb7a8","Type":"ContainerStarted","Data":"156dcf569285207db143fb364bf8343ba19f7306c33911c83a6268b75bad94c8"}
Apr 16 16:37:46.831893 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:46.831871 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr"
Apr 16 16:37:46.924768 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:46.924739 2576 generic.go:358] "Generic (PLEG): container finished" podID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerID="b53a7d154530fe79e796b0b4542ccb4b4ad3556735d49e1532fc11601c90a9dc" exitCode=0
Apr 16 16:37:46.924925 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:46.924807 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr"
Apr 16 16:37:46.924925 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:46.924808 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" event={"ID":"ccea3d31-02ad-4bde-96ab-93119af51d60","Type":"ContainerDied","Data":"b53a7d154530fe79e796b0b4542ccb4b4ad3556735d49e1532fc11601c90a9dc"}
Apr 16 16:37:46.924925 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:46.924904 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr" event={"ID":"ccea3d31-02ad-4bde-96ab-93119af51d60","Type":"ContainerDied","Data":"df68dc24f6772f6eaf14cb91183d18b6876f46f09b71bb3a2dd366f29047a012"}
Apr 16 16:37:46.924925 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:46.924921 2576 scope.go:117] "RemoveContainer" containerID="b53a7d154530fe79e796b0b4542ccb4b4ad3556735d49e1532fc11601c90a9dc"
Apr 16 16:37:46.926266 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:46.926242 2576 generic.go:358] "Generic (PLEG): container finished" podID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerID="f85420cb9d6e364ac855fdb3ca8de7b1990c17edb7cc5533733f3ade68f3dca0" exitCode=0
Apr 16 16:37:46.926372 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:46.926303 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" event={"ID":"a460751e-ae5f-419f-bc07-7c6d323cb7a8","Type":"ContainerDied","Data":"f85420cb9d6e364ac855fdb3ca8de7b1990c17edb7cc5533733f3ade68f3dca0"}
Apr 16 16:37:46.932776 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:46.932727 2576 scope.go:117] "RemoveContainer" containerID="b195442fa3767e51cc53d26cb402f9ebd0c58fa7d2eca4739246a126a51f3452"
Apr 16 16:37:46.939703 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:46.939682 2576 scope.go:117] "RemoveContainer" containerID="b53a7d154530fe79e796b0b4542ccb4b4ad3556735d49e1532fc11601c90a9dc"
Apr 16 16:37:46.939943 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:37:46.939924 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b53a7d154530fe79e796b0b4542ccb4b4ad3556735d49e1532fc11601c90a9dc\": container with ID starting with b53a7d154530fe79e796b0b4542ccb4b4ad3556735d49e1532fc11601c90a9dc not found: ID does not exist" containerID="b53a7d154530fe79e796b0b4542ccb4b4ad3556735d49e1532fc11601c90a9dc"
Apr 16 16:37:46.939998 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:46.939951 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53a7d154530fe79e796b0b4542ccb4b4ad3556735d49e1532fc11601c90a9dc"} err="failed to get container status \"b53a7d154530fe79e796b0b4542ccb4b4ad3556735d49e1532fc11601c90a9dc\": rpc error: code = NotFound desc = could not find container \"b53a7d154530fe79e796b0b4542ccb4b4ad3556735d49e1532fc11601c90a9dc\": container with ID starting with b53a7d154530fe79e796b0b4542ccb4b4ad3556735d49e1532fc11601c90a9dc not found: ID does not exist"
Apr 16 16:37:46.939998 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:46.939967 2576 scope.go:117] "RemoveContainer" containerID="b195442fa3767e51cc53d26cb402f9ebd0c58fa7d2eca4739246a126a51f3452"
Apr 16 16:37:46.940224 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:37:46.940191 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b195442fa3767e51cc53d26cb402f9ebd0c58fa7d2eca4739246a126a51f3452\": container with ID starting with b195442fa3767e51cc53d26cb402f9ebd0c58fa7d2eca4739246a126a51f3452 not found: ID does not exist" containerID="b195442fa3767e51cc53d26cb402f9ebd0c58fa7d2eca4739246a126a51f3452"
Apr 16 16:37:46.940278 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:46.940228 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b195442fa3767e51cc53d26cb402f9ebd0c58fa7d2eca4739246a126a51f3452"} err="failed to get container status \"b195442fa3767e51cc53d26cb402f9ebd0c58fa7d2eca4739246a126a51f3452\": rpc error: code = NotFound desc = could not find container \"b195442fa3767e51cc53d26cb402f9ebd0c58fa7d2eca4739246a126a51f3452\": container with ID starting with b195442fa3767e51cc53d26cb402f9ebd0c58fa7d2eca4739246a126a51f3452 not found: ID does not exist"
Apr 16 16:37:46.972954 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:46.972931 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ccea3d31-02ad-4bde-96ab-93119af51d60-kserve-provision-location\") pod \"ccea3d31-02ad-4bde-96ab-93119af51d60\" (UID: \"ccea3d31-02ad-4bde-96ab-93119af51d60\") "
Apr 16 16:37:46.973224 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:46.973185 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccea3d31-02ad-4bde-96ab-93119af51d60-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ccea3d31-02ad-4bde-96ab-93119af51d60" (UID: "ccea3d31-02ad-4bde-96ab-93119af51d60"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:37:47.074402 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:47.074368 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ccea3d31-02ad-4bde-96ab-93119af51d60-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:37:47.250482 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:47.250453 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr"]
Apr 16 16:37:47.254601 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:47.254576 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-577fdc969f-q7jfr"]
Apr 16 16:37:47.278714 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:47.278686 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccea3d31-02ad-4bde-96ab-93119af51d60" path="/var/lib/kubelet/pods/ccea3d31-02ad-4bde-96ab-93119af51d60/volumes"
Apr 16 16:37:47.931149 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:47.931111 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" event={"ID":"a460751e-ae5f-419f-bc07-7c6d323cb7a8","Type":"ContainerStarted","Data":"e8dc38a9f63bc1ba23335a9eacbc66a5867e7a2221b15b8bcab9f413319d4729"}
Apr 16 16:37:47.931656 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:47.931431 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq"
Apr 16 16:37:47.932679 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:47.932650 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" podUID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Apr 16 16:37:47.950330 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:47.950285 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" podStartSLOduration=5.950272405 podStartE2EDuration="5.950272405s" podCreationTimestamp="2026-04-16 16:37:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:37:47.948142307 +0000 UTC m=+2091.303132440" watchObservedRunningTime="2026-04-16 16:37:47.950272405 +0000 UTC m=+2091.305262530"
Apr 16 16:37:48.934987 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:48.934951 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" podUID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Apr 16 16:37:58.935693 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:37:58.935647 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" podUID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Apr 16 16:38:08.935746 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:38:08.935707 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" podUID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Apr 16 16:38:18.935195 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:38:18.935155 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" podUID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Apr 16 16:38:28.935348 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:38:28.935302 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" podUID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Apr 16 16:38:38.935188 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:38:38.935147 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" podUID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Apr 16 16:38:48.935474 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:38:48.935386 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" podUID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Apr 16 16:38:58.936436 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:38:58.936390 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq"
Apr 16 16:39:02.458597 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:02.458560 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq"]
Apr 16 16:39:02.459062 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:02.458793 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" podUID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerName="kserve-container" containerID="cri-o://e8dc38a9f63bc1ba23335a9eacbc66a5867e7a2221b15b8bcab9f413319d4729" gracePeriod=30
Apr 16 16:39:02.523903 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:02.523865 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r"]
Apr 16 16:39:02.524337 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:02.524313 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerName="kserve-container"
Apr 16 16:39:02.524337 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:02.524330 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerName="kserve-container"
Apr 16 16:39:02.524504 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:02.524345 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerName="storage-initializer"
Apr 16 16:39:02.524504 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:02.524353 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerName="storage-initializer"
Apr 16 16:39:02.524504 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:02.524402 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ccea3d31-02ad-4bde-96ab-93119af51d60" containerName="kserve-container"
Apr 16 16:39:02.527291 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:02.527273 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r"
Apr 16 16:39:02.535767 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:02.535739 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r"]
Apr 16 16:39:02.641677 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:02.641638 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b68ea46-e001-4826-aa42-23197e2691e6-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r\" (UID: \"3b68ea46-e001-4826-aa42-23197e2691e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r"
Apr 16 16:39:02.742720 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:02.742630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b68ea46-e001-4826-aa42-23197e2691e6-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r\" (UID: \"3b68ea46-e001-4826-aa42-23197e2691e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r"
Apr 16 16:39:02.743061 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:02.743038 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b68ea46-e001-4826-aa42-23197e2691e6-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r\" (UID: \"3b68ea46-e001-4826-aa42-23197e2691e6\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r"
Apr 16 16:39:02.838340 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:02.838299 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r"
Apr 16 16:39:02.955989 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:02.955823 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r"]
Apr 16 16:39:02.958798 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:39:02.958769 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b68ea46_e001_4826_aa42_23197e2691e6.slice/crio-aa2535b47e67467b764e095cdfa5a045931a86111557076f858e7b4aaa63e7e2 WatchSource:0}: Error finding container aa2535b47e67467b764e095cdfa5a045931a86111557076f858e7b4aaa63e7e2: Status 404 returned error can't find the container with id aa2535b47e67467b764e095cdfa5a045931a86111557076f858e7b4aaa63e7e2
Apr 16 16:39:03.173245 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:03.173194 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r" event={"ID":"3b68ea46-e001-4826-aa42-23197e2691e6","Type":"ContainerStarted","Data":"57dd258940022af45e966e5125c0e06085bcf61efcb614b00f4bae2468834a39"}
Apr 16 16:39:03.173245 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:03.173248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r" event={"ID":"3b68ea46-e001-4826-aa42-23197e2691e6","Type":"ContainerStarted","Data":"aa2535b47e67467b764e095cdfa5a045931a86111557076f858e7b4aaa63e7e2"}
Apr 16 16:39:06.988408 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:06.988386 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq"
Apr 16 16:39:07.175978 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.175897 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a460751e-ae5f-419f-bc07-7c6d323cb7a8-kserve-provision-location\") pod \"a460751e-ae5f-419f-bc07-7c6d323cb7a8\" (UID: \"a460751e-ae5f-419f-bc07-7c6d323cb7a8\") "
Apr 16 16:39:07.176249 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.176201 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a460751e-ae5f-419f-bc07-7c6d323cb7a8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a460751e-ae5f-419f-bc07-7c6d323cb7a8" (UID: "a460751e-ae5f-419f-bc07-7c6d323cb7a8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:39:07.188018 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.187991 2576 generic.go:358] "Generic (PLEG): container finished" podID="3b68ea46-e001-4826-aa42-23197e2691e6" containerID="57dd258940022af45e966e5125c0e06085bcf61efcb614b00f4bae2468834a39" exitCode=0
Apr 16 16:39:07.188156 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.188072 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r" event={"ID":"3b68ea46-e001-4826-aa42-23197e2691e6","Type":"ContainerDied","Data":"57dd258940022af45e966e5125c0e06085bcf61efcb614b00f4bae2468834a39"}
Apr 16 16:39:07.189408 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.189387 2576 generic.go:358] "Generic (PLEG): container finished" podID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerID="e8dc38a9f63bc1ba23335a9eacbc66a5867e7a2221b15b8bcab9f413319d4729" exitCode=0
Apr 16 16:39:07.189511 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.189458 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" event={"ID":"a460751e-ae5f-419f-bc07-7c6d323cb7a8","Type":"ContainerDied","Data":"e8dc38a9f63bc1ba23335a9eacbc66a5867e7a2221b15b8bcab9f413319d4729"}
Apr 16 16:39:07.189511 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.189486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq" event={"ID":"a460751e-ae5f-419f-bc07-7c6d323cb7a8","Type":"ContainerDied","Data":"156dcf569285207db143fb364bf8343ba19f7306c33911c83a6268b75bad94c8"}
Apr 16 16:39:07.189511 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.189508 2576 scope.go:117] "RemoveContainer" containerID="e8dc38a9f63bc1ba23335a9eacbc66a5867e7a2221b15b8bcab9f413319d4729"
Apr 16 16:39:07.189611 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.189516 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq"
Apr 16 16:39:07.199273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.199257 2576 scope.go:117] "RemoveContainer" containerID="f85420cb9d6e364ac855fdb3ca8de7b1990c17edb7cc5533733f3ade68f3dca0"
Apr 16 16:39:07.206639 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.206620 2576 scope.go:117] "RemoveContainer" containerID="e8dc38a9f63bc1ba23335a9eacbc66a5867e7a2221b15b8bcab9f413319d4729"
Apr 16 16:39:07.206897 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:39:07.206878 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8dc38a9f63bc1ba23335a9eacbc66a5867e7a2221b15b8bcab9f413319d4729\": container with ID starting with e8dc38a9f63bc1ba23335a9eacbc66a5867e7a2221b15b8bcab9f413319d4729 not found: ID does not exist" containerID="e8dc38a9f63bc1ba23335a9eacbc66a5867e7a2221b15b8bcab9f413319d4729"
Apr 16 16:39:07.206964 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.206906 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8dc38a9f63bc1ba23335a9eacbc66a5867e7a2221b15b8bcab9f413319d4729"} err="failed to get container status \"e8dc38a9f63bc1ba23335a9eacbc66a5867e7a2221b15b8bcab9f413319d4729\": rpc error: code = NotFound desc = could not find container \"e8dc38a9f63bc1ba23335a9eacbc66a5867e7a2221b15b8bcab9f413319d4729\": container with ID starting with e8dc38a9f63bc1ba23335a9eacbc66a5867e7a2221b15b8bcab9f413319d4729 not found: ID does not exist"
Apr 16 16:39:07.206964 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.206923 2576 scope.go:117] "RemoveContainer" containerID="f85420cb9d6e364ac855fdb3ca8de7b1990c17edb7cc5533733f3ade68f3dca0"
Apr 16 16:39:07.207194 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:39:07.207173 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f85420cb9d6e364ac855fdb3ca8de7b1990c17edb7cc5533733f3ade68f3dca0\": container with ID starting with f85420cb9d6e364ac855fdb3ca8de7b1990c17edb7cc5533733f3ade68f3dca0 not found: ID does not exist" containerID="f85420cb9d6e364ac855fdb3ca8de7b1990c17edb7cc5533733f3ade68f3dca0"
Apr 16 16:39:07.207277 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.207205 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f85420cb9d6e364ac855fdb3ca8de7b1990c17edb7cc5533733f3ade68f3dca0"} err="failed to get container status \"f85420cb9d6e364ac855fdb3ca8de7b1990c17edb7cc5533733f3ade68f3dca0\": rpc error: code = NotFound desc = could not find container \"f85420cb9d6e364ac855fdb3ca8de7b1990c17edb7cc5533733f3ade68f3dca0\": container with ID starting with f85420cb9d6e364ac855fdb3ca8de7b1990c17edb7cc5533733f3ade68f3dca0 not found: ID does not exist"
Apr 16 16:39:07.250098 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.250076 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq"]
Apr 16 16:39:07.257039 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.257018 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-785bb8f7db-fqqhq"]
Apr 16 16:39:07.276404 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.276380 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a460751e-ae5f-419f-bc07-7c6d323cb7a8-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:39:07.278499 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:07.278477 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" path="/var/lib/kubelet/pods/a460751e-ae5f-419f-bc07-7c6d323cb7a8/volumes"
Apr 16 16:39:08.194066 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:08.194026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r" event={"ID":"3b68ea46-e001-4826-aa42-23197e2691e6","Type":"ContainerStarted","Data":"f7c91cc03216d78766cb7975cc76bded70b8b3a5f17bfcfb40e603f9d84c1b19"}
Apr 16 16:39:08.194562 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:08.194279 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r"
Apr 16 16:39:08.209977 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:08.209925 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r" podStartSLOduration=6.209911222 podStartE2EDuration="6.209911222s" podCreationTimestamp="2026-04-16 16:39:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 16:39:08.209521652 +0000 UTC m=+2171.564511794" watchObservedRunningTime="2026-04-16 16:39:08.209911222 +0000 UTC m=+2171.564901356" Apr 16 16:39:39.200230 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:39.200161 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r" podUID="3b68ea46-e001-4826-aa42-23197e2691e6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.51:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.51:8080: connect: connection refused" Apr 16 16:39:49.199405 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:49.199360 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r" podUID="3b68ea46-e001-4826-aa42-23197e2691e6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.51:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.51:8080: connect: connection refused" Apr 16 16:39:59.199615 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:39:59.199570 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r" podUID="3b68ea46-e001-4826-aa42-23197e2691e6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.51:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.51:8080: connect: connection refused" Apr 16 16:40:09.199110 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:09.199062 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r" podUID="3b68ea46-e001-4826-aa42-23197e2691e6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.51:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.51:8080: connect: connection 
refused" Apr 16 16:40:19.202298 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:19.202203 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r" Apr 16 16:40:22.653688 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:22.653651 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r"] Apr 16 16:40:22.654145 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:22.653949 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r" podUID="3b68ea46-e001-4826-aa42-23197e2691e6" containerName="kserve-container" containerID="cri-o://f7c91cc03216d78766cb7975cc76bded70b8b3a5f17bfcfb40e603f9d84c1b19" gracePeriod=30 Apr 16 16:40:22.726021 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:22.725988 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg"] Apr 16 16:40:22.726344 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:22.726331 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerName="kserve-container" Apr 16 16:40:22.726394 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:22.726346 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerName="kserve-container" Apr 16 16:40:22.726394 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:22.726368 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerName="storage-initializer" Apr 16 16:40:22.726394 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:22.726374 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerName="storage-initializer" 
Apr 16 16:40:22.726489 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:22.726428 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a460751e-ae5f-419f-bc07-7c6d323cb7a8" containerName="kserve-container" Apr 16 16:40:22.730523 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:22.730506 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" Apr 16 16:40:22.736735 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:22.736715 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg"] Apr 16 16:40:22.854479 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:22.854448 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b846a93-2540-4450-8c3c-417545aecf80-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg\" (UID: \"5b846a93-2540-4450-8c3c-417545aecf80\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" Apr 16 16:40:22.955801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:22.955704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b846a93-2540-4450-8c3c-417545aecf80-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg\" (UID: \"5b846a93-2540-4450-8c3c-417545aecf80\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" Apr 16 16:40:22.956122 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:22.956097 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b846a93-2540-4450-8c3c-417545aecf80-kserve-provision-location\") pod 
\"isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg\" (UID: \"5b846a93-2540-4450-8c3c-417545aecf80\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" Apr 16 16:40:23.042036 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:23.041995 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" Apr 16 16:40:23.156797 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:23.156767 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg"] Apr 16 16:40:23.159428 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:40:23.159403 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b846a93_2540_4450_8c3c_417545aecf80.slice/crio-a07fcff1fce00a742ae62d5e4636e6e23a12f6533f64af2cd0ca4ffb0f89961e WatchSource:0}: Error finding container a07fcff1fce00a742ae62d5e4636e6e23a12f6533f64af2cd0ca4ffb0f89961e: Status 404 returned error can't find the container with id a07fcff1fce00a742ae62d5e4636e6e23a12f6533f64af2cd0ca4ffb0f89961e Apr 16 16:40:23.427170 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:23.427092 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" event={"ID":"5b846a93-2540-4450-8c3c-417545aecf80","Type":"ContainerStarted","Data":"e538de6ead66c42408a5ce8af93c98f7099dbf81154594a4f166ff0d4c0ffef2"} Apr 16 16:40:23.427170 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:23.427133 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" event={"ID":"5b846a93-2540-4450-8c3c-417545aecf80","Type":"ContainerStarted","Data":"a07fcff1fce00a742ae62d5e4636e6e23a12f6533f64af2cd0ca4ffb0f89961e"} Apr 16 16:40:27.392188 ip-10-0-137-150 kubenswrapper[2576]: 
I0416 16:40:27.392165 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r" Apr 16 16:40:27.441142 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.441106 2576 generic.go:358] "Generic (PLEG): container finished" podID="5b846a93-2540-4450-8c3c-417545aecf80" containerID="e538de6ead66c42408a5ce8af93c98f7099dbf81154594a4f166ff0d4c0ffef2" exitCode=0 Apr 16 16:40:27.441304 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.441191 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" event={"ID":"5b846a93-2540-4450-8c3c-417545aecf80","Type":"ContainerDied","Data":"e538de6ead66c42408a5ce8af93c98f7099dbf81154594a4f166ff0d4c0ffef2"} Apr 16 16:40:27.442686 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.442666 2576 generic.go:358] "Generic (PLEG): container finished" podID="3b68ea46-e001-4826-aa42-23197e2691e6" containerID="f7c91cc03216d78766cb7975cc76bded70b8b3a5f17bfcfb40e603f9d84c1b19" exitCode=0 Apr 16 16:40:27.442791 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.442723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r" event={"ID":"3b68ea46-e001-4826-aa42-23197e2691e6","Type":"ContainerDied","Data":"f7c91cc03216d78766cb7975cc76bded70b8b3a5f17bfcfb40e603f9d84c1b19"} Apr 16 16:40:27.442791 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.442742 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r" Apr 16 16:40:27.442791 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.442761 2576 scope.go:117] "RemoveContainer" containerID="f7c91cc03216d78766cb7975cc76bded70b8b3a5f17bfcfb40e603f9d84c1b19" Apr 16 16:40:27.442915 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.442751 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r" event={"ID":"3b68ea46-e001-4826-aa42-23197e2691e6","Type":"ContainerDied","Data":"aa2535b47e67467b764e095cdfa5a045931a86111557076f858e7b4aaa63e7e2"} Apr 16 16:40:27.450187 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.450170 2576 scope.go:117] "RemoveContainer" containerID="57dd258940022af45e966e5125c0e06085bcf61efcb614b00f4bae2468834a39" Apr 16 16:40:27.458940 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.458920 2576 scope.go:117] "RemoveContainer" containerID="f7c91cc03216d78766cb7975cc76bded70b8b3a5f17bfcfb40e603f9d84c1b19" Apr 16 16:40:27.459224 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:40:27.459192 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7c91cc03216d78766cb7975cc76bded70b8b3a5f17bfcfb40e603f9d84c1b19\": container with ID starting with f7c91cc03216d78766cb7975cc76bded70b8b3a5f17bfcfb40e603f9d84c1b19 not found: ID does not exist" containerID="f7c91cc03216d78766cb7975cc76bded70b8b3a5f17bfcfb40e603f9d84c1b19" Apr 16 16:40:27.459300 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.459237 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c91cc03216d78766cb7975cc76bded70b8b3a5f17bfcfb40e603f9d84c1b19"} err="failed to get container status \"f7c91cc03216d78766cb7975cc76bded70b8b3a5f17bfcfb40e603f9d84c1b19\": rpc error: code = NotFound desc = could not find container 
\"f7c91cc03216d78766cb7975cc76bded70b8b3a5f17bfcfb40e603f9d84c1b19\": container with ID starting with f7c91cc03216d78766cb7975cc76bded70b8b3a5f17bfcfb40e603f9d84c1b19 not found: ID does not exist" Apr 16 16:40:27.459300 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.459254 2576 scope.go:117] "RemoveContainer" containerID="57dd258940022af45e966e5125c0e06085bcf61efcb614b00f4bae2468834a39" Apr 16 16:40:27.459506 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:40:27.459486 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57dd258940022af45e966e5125c0e06085bcf61efcb614b00f4bae2468834a39\": container with ID starting with 57dd258940022af45e966e5125c0e06085bcf61efcb614b00f4bae2468834a39 not found: ID does not exist" containerID="57dd258940022af45e966e5125c0e06085bcf61efcb614b00f4bae2468834a39" Apr 16 16:40:27.459569 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.459513 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57dd258940022af45e966e5125c0e06085bcf61efcb614b00f4bae2468834a39"} err="failed to get container status \"57dd258940022af45e966e5125c0e06085bcf61efcb614b00f4bae2468834a39\": rpc error: code = NotFound desc = could not find container \"57dd258940022af45e966e5125c0e06085bcf61efcb614b00f4bae2468834a39\": container with ID starting with 57dd258940022af45e966e5125c0e06085bcf61efcb614b00f4bae2468834a39 not found: ID does not exist" Apr 16 16:40:27.489719 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.489692 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b68ea46-e001-4826-aa42-23197e2691e6-kserve-provision-location\") pod \"3b68ea46-e001-4826-aa42-23197e2691e6\" (UID: \"3b68ea46-e001-4826-aa42-23197e2691e6\") " Apr 16 16:40:27.490000 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.489975 2576 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b68ea46-e001-4826-aa42-23197e2691e6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3b68ea46-e001-4826-aa42-23197e2691e6" (UID: "3b68ea46-e001-4826-aa42-23197e2691e6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:40:27.590286 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.590253 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b68ea46-e001-4826-aa42-23197e2691e6-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:40:27.763523 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.763496 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r"] Apr 16 16:40:27.767130 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:27.767103 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-7f8f9f49bf-4l48r"] Apr 16 16:40:28.448021 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:28.447992 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" event={"ID":"5b846a93-2540-4450-8c3c-417545aecf80","Type":"ContainerStarted","Data":"22e028dd6f7d1bd62f8643ab2baf59fbd95846c2f84c567866f6e11b2362ca34"} Apr 16 16:40:28.448412 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:28.448192 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" Apr 16 16:40:28.464720 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:28.464664 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" podStartSLOduration=6.464651832 podStartE2EDuration="6.464651832s" podCreationTimestamp="2026-04-16 16:40:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:40:28.463740925 +0000 UTC m=+2251.818731079" watchObservedRunningTime="2026-04-16 16:40:28.464651832 +0000 UTC m=+2251.819641958" Apr 16 16:40:29.278150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:29.278123 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b68ea46-e001-4826-aa42-23197e2691e6" path="/var/lib/kubelet/pods/3b68ea46-e001-4826-aa42-23197e2691e6/volumes" Apr 16 16:40:59.454529 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:40:59.454485 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" podUID="5b846a93-2540-4450-8c3c-417545aecf80" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.52:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 16:41:09.453381 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:09.453332 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" podUID="5b846a93-2540-4450-8c3c-417545aecf80" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.52:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 16:41:19.452635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:19.452587 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" podUID="5b846a93-2540-4450-8c3c-417545aecf80" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.133.0.52:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 16:41:29.452613 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:29.452568 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" podUID="5b846a93-2540-4450-8c3c-417545aecf80" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.52:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.52:8080: connect: connection refused" Apr 16 16:41:39.456020 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:39.455985 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" Apr 16 16:41:42.835730 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:42.835692 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg"] Apr 16 16:41:42.836188 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:42.835928 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" podUID="5b846a93-2540-4450-8c3c-417545aecf80" containerName="kserve-container" containerID="cri-o://22e028dd6f7d1bd62f8643ab2baf59fbd95846c2f84c567866f6e11b2362ca34" gracePeriod=30 Apr 16 16:41:42.892981 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:42.892950 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b"] Apr 16 16:41:42.893320 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:42.893303 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b68ea46-e001-4826-aa42-23197e2691e6" containerName="storage-initializer" Apr 16 16:41:42.893320 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:42.893322 2576 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3b68ea46-e001-4826-aa42-23197e2691e6" containerName="storage-initializer" Apr 16 16:41:42.893459 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:42.893332 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b68ea46-e001-4826-aa42-23197e2691e6" containerName="kserve-container" Apr 16 16:41:42.893459 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:42.893337 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b68ea46-e001-4826-aa42-23197e2691e6" containerName="kserve-container" Apr 16 16:41:42.893459 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:42.893390 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b68ea46-e001-4826-aa42-23197e2691e6" containerName="kserve-container" Apr 16 16:41:42.896549 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:42.896529 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" Apr 16 16:41:42.903352 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:42.903324 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b"] Apr 16 16:41:42.983626 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:42.983597 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c0f572c-78fe-4542-99b7-2a169550091d-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b\" (UID: \"7c0f572c-78fe-4542-99b7-2a169550091d\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" Apr 16 16:41:43.084895 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:43.084866 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/7c0f572c-78fe-4542-99b7-2a169550091d-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b\" (UID: \"7c0f572c-78fe-4542-99b7-2a169550091d\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" Apr 16 16:41:43.085174 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:43.085157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c0f572c-78fe-4542-99b7-2a169550091d-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b\" (UID: \"7c0f572c-78fe-4542-99b7-2a169550091d\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" Apr 16 16:41:43.207069 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:43.207050 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" Apr 16 16:41:43.326474 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:43.326451 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b"] Apr 16 16:41:43.328347 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:41:43.328318 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c0f572c_78fe_4542_99b7_2a169550091d.slice/crio-8ca34ae9ac37998783d7c2b5399bc30ada9ade613d548079eb316966f48c471a WatchSource:0}: Error finding container 8ca34ae9ac37998783d7c2b5399bc30ada9ade613d548079eb316966f48c471a: Status 404 returned error can't find the container with id 8ca34ae9ac37998783d7c2b5399bc30ada9ade613d548079eb316966f48c471a Apr 16 16:41:43.693802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:43.693765 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" 
event={"ID":"7c0f572c-78fe-4542-99b7-2a169550091d","Type":"ContainerStarted","Data":"5ef5d3fc8e129741cfe1f3ca99035fa90d8c260e3e9dfb5c9718fbdf54f6c67d"} Apr 16 16:41:43.693802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:43.693807 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" event={"ID":"7c0f572c-78fe-4542-99b7-2a169550091d","Type":"ContainerStarted","Data":"8ca34ae9ac37998783d7c2b5399bc30ada9ade613d548079eb316966f48c471a"} Apr 16 16:41:47.221547 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.221524 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" Apr 16 16:41:47.312994 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.312917 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b846a93-2540-4450-8c3c-417545aecf80-kserve-provision-location\") pod \"5b846a93-2540-4450-8c3c-417545aecf80\" (UID: \"5b846a93-2540-4450-8c3c-417545aecf80\") " Apr 16 16:41:47.313192 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.313167 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b846a93-2540-4450-8c3c-417545aecf80-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5b846a93-2540-4450-8c3c-417545aecf80" (UID: "5b846a93-2540-4450-8c3c-417545aecf80"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:41:47.413391 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.413360 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b846a93-2540-4450-8c3c-417545aecf80-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:41:47.709489 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.709453 2576 generic.go:358] "Generic (PLEG): container finished" podID="7c0f572c-78fe-4542-99b7-2a169550091d" containerID="5ef5d3fc8e129741cfe1f3ca99035fa90d8c260e3e9dfb5c9718fbdf54f6c67d" exitCode=0
Apr 16 16:41:47.709724 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.709523 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" event={"ID":"7c0f572c-78fe-4542-99b7-2a169550091d","Type":"ContainerDied","Data":"5ef5d3fc8e129741cfe1f3ca99035fa90d8c260e3e9dfb5c9718fbdf54f6c67d"}
Apr 16 16:41:47.711005 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.710984 2576 generic.go:358] "Generic (PLEG): container finished" podID="5b846a93-2540-4450-8c3c-417545aecf80" containerID="22e028dd6f7d1bd62f8643ab2baf59fbd95846c2f84c567866f6e11b2362ca34" exitCode=0
Apr 16 16:41:47.711087 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.711063 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" event={"ID":"5b846a93-2540-4450-8c3c-417545aecf80","Type":"ContainerDied","Data":"22e028dd6f7d1bd62f8643ab2baf59fbd95846c2f84c567866f6e11b2362ca34"}
Apr 16 16:41:47.711087 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.711074 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg"
Apr 16 16:41:47.711087 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.711085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg" event={"ID":"5b846a93-2540-4450-8c3c-417545aecf80","Type":"ContainerDied","Data":"a07fcff1fce00a742ae62d5e4636e6e23a12f6533f64af2cd0ca4ffb0f89961e"}
Apr 16 16:41:47.711203 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.711100 2576 scope.go:117] "RemoveContainer" containerID="22e028dd6f7d1bd62f8643ab2baf59fbd95846c2f84c567866f6e11b2362ca34"
Apr 16 16:41:47.719997 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.719974 2576 scope.go:117] "RemoveContainer" containerID="e538de6ead66c42408a5ce8af93c98f7099dbf81154594a4f166ff0d4c0ffef2"
Apr 16 16:41:47.727668 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.727645 2576 scope.go:117] "RemoveContainer" containerID="22e028dd6f7d1bd62f8643ab2baf59fbd95846c2f84c567866f6e11b2362ca34"
Apr 16 16:41:47.727936 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:41:47.727913 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22e028dd6f7d1bd62f8643ab2baf59fbd95846c2f84c567866f6e11b2362ca34\": container with ID starting with 22e028dd6f7d1bd62f8643ab2baf59fbd95846c2f84c567866f6e11b2362ca34 not found: ID does not exist" containerID="22e028dd6f7d1bd62f8643ab2baf59fbd95846c2f84c567866f6e11b2362ca34"
Apr 16 16:41:47.728024 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.727943 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22e028dd6f7d1bd62f8643ab2baf59fbd95846c2f84c567866f6e11b2362ca34"} err="failed to get container status \"22e028dd6f7d1bd62f8643ab2baf59fbd95846c2f84c567866f6e11b2362ca34\": rpc error: code = NotFound desc = could not find container \"22e028dd6f7d1bd62f8643ab2baf59fbd95846c2f84c567866f6e11b2362ca34\": container with ID starting with 22e028dd6f7d1bd62f8643ab2baf59fbd95846c2f84c567866f6e11b2362ca34 not found: ID does not exist"
Apr 16 16:41:47.728024 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.727959 2576 scope.go:117] "RemoveContainer" containerID="e538de6ead66c42408a5ce8af93c98f7099dbf81154594a4f166ff0d4c0ffef2"
Apr 16 16:41:47.728197 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:41:47.728181 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e538de6ead66c42408a5ce8af93c98f7099dbf81154594a4f166ff0d4c0ffef2\": container with ID starting with e538de6ead66c42408a5ce8af93c98f7099dbf81154594a4f166ff0d4c0ffef2 not found: ID does not exist" containerID="e538de6ead66c42408a5ce8af93c98f7099dbf81154594a4f166ff0d4c0ffef2"
Apr 16 16:41:47.728267 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.728203 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e538de6ead66c42408a5ce8af93c98f7099dbf81154594a4f166ff0d4c0ffef2"} err="failed to get container status \"e538de6ead66c42408a5ce8af93c98f7099dbf81154594a4f166ff0d4c0ffef2\": rpc error: code = NotFound desc = could not find container \"e538de6ead66c42408a5ce8af93c98f7099dbf81154594a4f166ff0d4c0ffef2\": container with ID starting with e538de6ead66c42408a5ce8af93c98f7099dbf81154594a4f166ff0d4c0ffef2 not found: ID does not exist"
Apr 16 16:41:47.738371 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.738351 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg"]
Apr 16 16:41:47.739795 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:47.739774 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-795b445d66-5kbwg"]
Apr 16 16:41:48.716698 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:48.716659 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" event={"ID":"7c0f572c-78fe-4542-99b7-2a169550091d","Type":"ContainerStarted","Data":"858091af6092be666919b6a4cdea1e915afeb052c4328aef11fe7a0b23c80868"}
Apr 16 16:41:48.717142 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:48.716887 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b"
Apr 16 16:41:48.732418 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:48.732368 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" podStartSLOduration=6.732354433 podStartE2EDuration="6.732354433s" podCreationTimestamp="2026-04-16 16:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:41:48.731667402 +0000 UTC m=+2332.086657532" watchObservedRunningTime="2026-04-16 16:41:48.732354433 +0000 UTC m=+2332.087344612"
Apr 16 16:41:49.278971 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:41:49.278938 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b846a93-2540-4450-8c3c-417545aecf80" path="/var/lib/kubelet/pods/5b846a93-2540-4450-8c3c-417545aecf80/volumes"
Apr 16 16:42:19.723074 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:42:19.723033 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" podUID="7c0f572c-78fe-4542-99b7-2a169550091d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.53:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.53:8080: connect: connection refused"
Apr 16 16:42:29.721763 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:42:29.721722 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" podUID="7c0f572c-78fe-4542-99b7-2a169550091d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.53:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.53:8080: connect: connection refused"
Apr 16 16:42:39.721551 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:42:39.721515 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" podUID="7c0f572c-78fe-4542-99b7-2a169550091d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.53:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.53:8080: connect: connection refused"
Apr 16 16:42:49.722390 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:42:49.722350 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" podUID="7c0f572c-78fe-4542-99b7-2a169550091d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.53:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.53:8080: connect: connection refused"
Apr 16 16:42:59.725185 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:42:59.725154 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b"
Apr 16 16:43:02.996067 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:02.996034 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b"]
Apr 16 16:43:02.996465 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:02.996266 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" podUID="7c0f572c-78fe-4542-99b7-2a169550091d" containerName="kserve-container" containerID="cri-o://858091af6092be666919b6a4cdea1e915afeb052c4328aef11fe7a0b23c80868" gracePeriod=30
Apr 16 16:43:05.168675 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.168638 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh"]
Apr 16 16:43:05.169105 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.168999 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b846a93-2540-4450-8c3c-417545aecf80" containerName="kserve-container"
Apr 16 16:43:05.169105 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.169011 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b846a93-2540-4450-8c3c-417545aecf80" containerName="kserve-container"
Apr 16 16:43:05.169105 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.169019 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b846a93-2540-4450-8c3c-417545aecf80" containerName="storage-initializer"
Apr 16 16:43:05.169105 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.169026 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b846a93-2540-4450-8c3c-417545aecf80" containerName="storage-initializer"
Apr 16 16:43:05.169105 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.169083 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b846a93-2540-4450-8c3c-417545aecf80" containerName="kserve-container"
Apr 16 16:43:05.172066 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.172047 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh"
Apr 16 16:43:05.179713 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.179691 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh"]
Apr 16 16:43:05.289566 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.289535 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97e53918-526a-477a-b50d-ebba27538fa3-kserve-provision-location\") pod \"isvc-sklearn-predictor-59d84b47f5-2jbhh\" (UID: \"97e53918-526a-477a-b50d-ebba27538fa3\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh"
Apr 16 16:43:05.389889 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.389853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97e53918-526a-477a-b50d-ebba27538fa3-kserve-provision-location\") pod \"isvc-sklearn-predictor-59d84b47f5-2jbhh\" (UID: \"97e53918-526a-477a-b50d-ebba27538fa3\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh"
Apr 16 16:43:05.390260 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.390201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97e53918-526a-477a-b50d-ebba27538fa3-kserve-provision-location\") pod \"isvc-sklearn-predictor-59d84b47f5-2jbhh\" (UID: \"97e53918-526a-477a-b50d-ebba27538fa3\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh"
Apr 16 16:43:05.483304 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.483229 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh"
Apr 16 16:43:05.599777 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.599752 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh"]
Apr 16 16:43:05.602291 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:43:05.602254 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97e53918_526a_477a_b50d_ebba27538fa3.slice/crio-1432ef37ce92d41a7331e896f2118e525d4e227f12067f6a07c9682a143e8de5 WatchSource:0}: Error finding container 1432ef37ce92d41a7331e896f2118e525d4e227f12067f6a07c9682a143e8de5: Status 404 returned error can't find the container with id 1432ef37ce92d41a7331e896f2118e525d4e227f12067f6a07c9682a143e8de5
Apr 16 16:43:05.603977 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.603960 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:43:05.962420 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.962385 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" event={"ID":"97e53918-526a-477a-b50d-ebba27538fa3","Type":"ContainerStarted","Data":"6b2ff97943247123dcc6eeb4e2904bb709da993c2d49e09a806bab873d0354a6"}
Apr 16 16:43:05.962420 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:05.962424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" event={"ID":"97e53918-526a-477a-b50d-ebba27538fa3","Type":"ContainerStarted","Data":"1432ef37ce92d41a7331e896f2118e525d4e227f12067f6a07c9682a143e8de5"}
Apr 16 16:43:07.970727 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:07.970695 2576 generic.go:358] "Generic (PLEG): container finished" podID="7c0f572c-78fe-4542-99b7-2a169550091d" containerID="858091af6092be666919b6a4cdea1e915afeb052c4328aef11fe7a0b23c80868" exitCode=0
Apr 16 16:43:07.971151 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:07.970728 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" event={"ID":"7c0f572c-78fe-4542-99b7-2a169550091d","Type":"ContainerDied","Data":"858091af6092be666919b6a4cdea1e915afeb052c4328aef11fe7a0b23c80868"}
Apr 16 16:43:08.030775 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:08.030755 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b"
Apr 16 16:43:08.212754 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:08.212662 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c0f572c-78fe-4542-99b7-2a169550091d-kserve-provision-location\") pod \"7c0f572c-78fe-4542-99b7-2a169550091d\" (UID: \"7c0f572c-78fe-4542-99b7-2a169550091d\") "
Apr 16 16:43:08.212985 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:08.212964 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c0f572c-78fe-4542-99b7-2a169550091d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7c0f572c-78fe-4542-99b7-2a169550091d" (UID: "7c0f572c-78fe-4542-99b7-2a169550091d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:43:08.313703 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:08.313671 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c0f572c-78fe-4542-99b7-2a169550091d-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:43:08.975299 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:08.975265 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b" event={"ID":"7c0f572c-78fe-4542-99b7-2a169550091d","Type":"ContainerDied","Data":"8ca34ae9ac37998783d7c2b5399bc30ada9ade613d548079eb316966f48c471a"}
Apr 16 16:43:08.975766 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:08.975310 2576 scope.go:117] "RemoveContainer" containerID="858091af6092be666919b6a4cdea1e915afeb052c4328aef11fe7a0b23c80868"
Apr 16 16:43:08.975766 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:08.975314 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b"
Apr 16 16:43:08.983137 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:08.983120 2576 scope.go:117] "RemoveContainer" containerID="5ef5d3fc8e129741cfe1f3ca99035fa90d8c260e3e9dfb5c9718fbdf54f6c67d"
Apr 16 16:43:08.995997 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:08.995971 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b"]
Apr 16 16:43:09.000901 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:09.000881 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-665ddb84b7-tk98b"]
Apr 16 16:43:09.277724 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:09.277688 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c0f572c-78fe-4542-99b7-2a169550091d" path="/var/lib/kubelet/pods/7c0f572c-78fe-4542-99b7-2a169550091d/volumes"
Apr 16 16:43:09.980830 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:09.980799 2576 generic.go:358] "Generic (PLEG): container finished" podID="97e53918-526a-477a-b50d-ebba27538fa3" containerID="6b2ff97943247123dcc6eeb4e2904bb709da993c2d49e09a806bab873d0354a6" exitCode=0
Apr 16 16:43:09.981245 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:09.980872 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" event={"ID":"97e53918-526a-477a-b50d-ebba27538fa3","Type":"ContainerDied","Data":"6b2ff97943247123dcc6eeb4e2904bb709da993c2d49e09a806bab873d0354a6"}
Apr 16 16:43:10.985613 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:10.985576 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" event={"ID":"97e53918-526a-477a-b50d-ebba27538fa3","Type":"ContainerStarted","Data":"6f6198c5ceb32b3a6ae8ba2b7389c5c57b71a6736bcecc7eaa90c0b912f5c425"}
Apr 16 16:43:10.986044 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:10.985858 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh"
Apr 16 16:43:10.987220 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:10.987185 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" podUID="97e53918-526a-477a-b50d-ebba27538fa3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused"
Apr 16 16:43:11.008369 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:11.008325 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" podStartSLOduration=6.008311362 podStartE2EDuration="6.008311362s" podCreationTimestamp="2026-04-16 16:43:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:43:11.006125462 +0000 UTC m=+2414.361115591" watchObservedRunningTime="2026-04-16 16:43:11.008311362 +0000 UTC m=+2414.363301568"
Apr 16 16:43:11.989437 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:11.989401 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" podUID="97e53918-526a-477a-b50d-ebba27538fa3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused"
Apr 16 16:43:21.990250 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:21.990137 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" podUID="97e53918-526a-477a-b50d-ebba27538fa3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused"
Apr 16 16:43:31.989440 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:31.989397 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" podUID="97e53918-526a-477a-b50d-ebba27538fa3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused"
Apr 16 16:43:41.989751 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:41.989706 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" podUID="97e53918-526a-477a-b50d-ebba27538fa3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused"
Apr 16 16:43:51.990286 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:43:51.990243 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" podUID="97e53918-526a-477a-b50d-ebba27538fa3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused"
Apr 16 16:44:01.989654 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:01.989612 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" podUID="97e53918-526a-477a-b50d-ebba27538fa3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused"
Apr 16 16:44:11.990056 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:11.990010 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" podUID="97e53918-526a-477a-b50d-ebba27538fa3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused"
Apr 16 16:44:21.990402 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:21.990368 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh"
Apr 16 16:44:25.318851 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:25.318815 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh"]
Apr 16 16:44:25.319244 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:25.319101 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" podUID="97e53918-526a-477a-b50d-ebba27538fa3" containerName="kserve-container" containerID="cri-o://6f6198c5ceb32b3a6ae8ba2b7389c5c57b71a6736bcecc7eaa90c0b912f5c425" gracePeriod=30
Apr 16 16:44:25.366865 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:25.366828 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h"]
Apr 16 16:44:25.367187 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:25.367171 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c0f572c-78fe-4542-99b7-2a169550091d" containerName="kserve-container"
Apr 16 16:44:25.367273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:25.367189 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0f572c-78fe-4542-99b7-2a169550091d" containerName="kserve-container"
Apr 16 16:44:25.367273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:25.367232 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c0f572c-78fe-4542-99b7-2a169550091d" containerName="storage-initializer"
Apr 16 16:44:25.367273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:25.367239 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0f572c-78fe-4542-99b7-2a169550091d" containerName="storage-initializer"
Apr 16 16:44:25.367382 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:25.367297 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c0f572c-78fe-4542-99b7-2a169550091d" containerName="kserve-container"
Apr 16 16:44:25.371513 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:25.371493 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h"
Apr 16 16:44:25.379094 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:25.379067 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h"]
Apr 16 16:44:25.421200 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:25.421168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/367f6856-4854-42a5-a7b1-40f0c7163116-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-5fdf655847-fhg7h\" (UID: \"367f6856-4854-42a5-a7b1-40f0c7163116\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h"
Apr 16 16:44:25.521681 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:25.521644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/367f6856-4854-42a5-a7b1-40f0c7163116-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-5fdf655847-fhg7h\" (UID: \"367f6856-4854-42a5-a7b1-40f0c7163116\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h"
Apr 16 16:44:25.522020 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:25.522001 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/367f6856-4854-42a5-a7b1-40f0c7163116-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-5fdf655847-fhg7h\" (UID: \"367f6856-4854-42a5-a7b1-40f0c7163116\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h"
Apr 16 16:44:25.682258 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:25.682195 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h"
Apr 16 16:44:25.803197 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:25.803159 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h"]
Apr 16 16:44:25.806174 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:44:25.806142 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod367f6856_4854_42a5_a7b1_40f0c7163116.slice/crio-44866ea7f9a21486e693b0d84730cfcd1d7a9f3a336ae5ef1332d475ef67a5a6 WatchSource:0}: Error finding container 44866ea7f9a21486e693b0d84730cfcd1d7a9f3a336ae5ef1332d475ef67a5a6: Status 404 returned error can't find the container with id 44866ea7f9a21486e693b0d84730cfcd1d7a9f3a336ae5ef1332d475ef67a5a6
Apr 16 16:44:26.226905 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:26.226868 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h" event={"ID":"367f6856-4854-42a5-a7b1-40f0c7163116","Type":"ContainerStarted","Data":"6bb78ee63884aadeb148caf89cd63563847d01e1fc19917d8cf95da5764c9968"}
Apr 16 16:44:26.227083 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:26.226913 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h" event={"ID":"367f6856-4854-42a5-a7b1-40f0c7163116","Type":"ContainerStarted","Data":"44866ea7f9a21486e693b0d84730cfcd1d7a9f3a336ae5ef1332d475ef67a5a6"}
Apr 16 16:44:29.955726 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:29.955701 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh"
Apr 16 16:44:30.055734 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.055692 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97e53918-526a-477a-b50d-ebba27538fa3-kserve-provision-location\") pod \"97e53918-526a-477a-b50d-ebba27538fa3\" (UID: \"97e53918-526a-477a-b50d-ebba27538fa3\") "
Apr 16 16:44:30.056014 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.055987 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97e53918-526a-477a-b50d-ebba27538fa3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "97e53918-526a-477a-b50d-ebba27538fa3" (UID: "97e53918-526a-477a-b50d-ebba27538fa3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:44:30.156724 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.156689 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97e53918-526a-477a-b50d-ebba27538fa3-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:44:30.242168 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.242138 2576 generic.go:358] "Generic (PLEG): container finished" podID="97e53918-526a-477a-b50d-ebba27538fa3" containerID="6f6198c5ceb32b3a6ae8ba2b7389c5c57b71a6736bcecc7eaa90c0b912f5c425" exitCode=0
Apr 16 16:44:30.242358 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.242228 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh"
Apr 16 16:44:30.242358 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.242242 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" event={"ID":"97e53918-526a-477a-b50d-ebba27538fa3","Type":"ContainerDied","Data":"6f6198c5ceb32b3a6ae8ba2b7389c5c57b71a6736bcecc7eaa90c0b912f5c425"}
Apr 16 16:44:30.242358 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.242279 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh" event={"ID":"97e53918-526a-477a-b50d-ebba27538fa3","Type":"ContainerDied","Data":"1432ef37ce92d41a7331e896f2118e525d4e227f12067f6a07c9682a143e8de5"}
Apr 16 16:44:30.242358 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.242298 2576 scope.go:117] "RemoveContainer" containerID="6f6198c5ceb32b3a6ae8ba2b7389c5c57b71a6736bcecc7eaa90c0b912f5c425"
Apr 16 16:44:30.243514 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.243497 2576 generic.go:358] "Generic (PLEG): container finished" podID="367f6856-4854-42a5-a7b1-40f0c7163116" containerID="6bb78ee63884aadeb148caf89cd63563847d01e1fc19917d8cf95da5764c9968" exitCode=0
Apr 16 16:44:30.243608 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.243580 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h" event={"ID":"367f6856-4854-42a5-a7b1-40f0c7163116","Type":"ContainerDied","Data":"6bb78ee63884aadeb148caf89cd63563847d01e1fc19917d8cf95da5764c9968"}
Apr 16 16:44:30.250082 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.250063 2576 scope.go:117] "RemoveContainer" containerID="6b2ff97943247123dcc6eeb4e2904bb709da993c2d49e09a806bab873d0354a6"
Apr 16 16:44:30.259728 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.259708 2576 scope.go:117] "RemoveContainer" containerID="6f6198c5ceb32b3a6ae8ba2b7389c5c57b71a6736bcecc7eaa90c0b912f5c425"
Apr 16 16:44:30.260026 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:44:30.260005 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f6198c5ceb32b3a6ae8ba2b7389c5c57b71a6736bcecc7eaa90c0b912f5c425\": container with ID starting with 6f6198c5ceb32b3a6ae8ba2b7389c5c57b71a6736bcecc7eaa90c0b912f5c425 not found: ID does not exist" containerID="6f6198c5ceb32b3a6ae8ba2b7389c5c57b71a6736bcecc7eaa90c0b912f5c425"
Apr 16 16:44:30.260081 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.260037 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f6198c5ceb32b3a6ae8ba2b7389c5c57b71a6736bcecc7eaa90c0b912f5c425"} err="failed to get container status \"6f6198c5ceb32b3a6ae8ba2b7389c5c57b71a6736bcecc7eaa90c0b912f5c425\": rpc error: code = NotFound desc = could not find container \"6f6198c5ceb32b3a6ae8ba2b7389c5c57b71a6736bcecc7eaa90c0b912f5c425\": container with ID starting with 6f6198c5ceb32b3a6ae8ba2b7389c5c57b71a6736bcecc7eaa90c0b912f5c425 not found: ID does not exist"
Apr 16 16:44:30.260081 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.260055 2576 scope.go:117] "RemoveContainer" containerID="6b2ff97943247123dcc6eeb4e2904bb709da993c2d49e09a806bab873d0354a6"
Apr 16 16:44:30.260351 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:44:30.260334 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b2ff97943247123dcc6eeb4e2904bb709da993c2d49e09a806bab873d0354a6\": container with ID starting with 6b2ff97943247123dcc6eeb4e2904bb709da993c2d49e09a806bab873d0354a6 not found: ID does not exist" containerID="6b2ff97943247123dcc6eeb4e2904bb709da993c2d49e09a806bab873d0354a6"
Apr 16 16:44:30.260422 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.260356 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b2ff97943247123dcc6eeb4e2904bb709da993c2d49e09a806bab873d0354a6"} err="failed to get container status \"6b2ff97943247123dcc6eeb4e2904bb709da993c2d49e09a806bab873d0354a6\": rpc error: code = NotFound desc = could not find container \"6b2ff97943247123dcc6eeb4e2904bb709da993c2d49e09a806bab873d0354a6\": container with ID starting with 6b2ff97943247123dcc6eeb4e2904bb709da993c2d49e09a806bab873d0354a6 not found: ID does not exist"
Apr 16 16:44:30.275682 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.275652 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh"]
Apr 16 16:44:30.279845 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:30.279818 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-59d84b47f5-2jbhh"]
Apr 16 16:44:31.248783 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:31.248748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h" event={"ID":"367f6856-4854-42a5-a7b1-40f0c7163116","Type":"ContainerStarted","Data":"92a75ef1006207e532ed5c5ab1d09133c00379e8c4bdc846cb83631e0266a713"}
Apr 16 16:44:31.249198 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:31.248953 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h"
Apr 16 16:44:31.267807 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:31.267758 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h" podStartSLOduration=6.267745254 podStartE2EDuration="6.267745254s" podCreationTimestamp="2026-04-16 16:44:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:44:31.265674023 +0000 UTC m=+2494.620664152" watchObservedRunningTime="2026-04-16 16:44:31.267745254 +0000 UTC m=+2494.622735382"
Apr 16 16:44:31.278321 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:44:31.278290 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e53918-526a-477a-b50d-ebba27538fa3" path="/var/lib/kubelet/pods/97e53918-526a-477a-b50d-ebba27538fa3/volumes"
Apr 16 16:45:02.326150 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:02.326050 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h" podUID="367f6856-4854-42a5-a7b1-40f0c7163116" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 16 16:45:12.255389 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:12.255353 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h"
Apr 16 16:45:15.550541 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:15.550503 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5"]
Apr 16 16:45:15.550971 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:15.550874 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97e53918-526a-477a-b50d-ebba27538fa3" containerName="kserve-container"
Apr 16 16:45:15.550971 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:15.550887 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e53918-526a-477a-b50d-ebba27538fa3" containerName="kserve-container"
Apr 16 16:45:15.550971 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:15.550897 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97e53918-526a-477a-b50d-ebba27538fa3" containerName="storage-initializer"
Apr 16 16:45:15.550971 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:15.550903 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e53918-526a-477a-b50d-ebba27538fa3" containerName="storage-initializer"
Apr 16 16:45:15.550971 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:15.550958 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="97e53918-526a-477a-b50d-ebba27538fa3" containerName="kserve-container"
Apr 16 16:45:15.554914 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:15.554899 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5"
Apr 16 16:45:15.564102 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:15.564078 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5"]
Apr 16 16:45:15.603415 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:15.603383 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h"]
Apr 16 16:45:15.603634 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:15.603614 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h" podUID="367f6856-4854-42a5-a7b1-40f0c7163116" containerName="kserve-container" containerID="cri-o://92a75ef1006207e532ed5c5ab1d09133c00379e8c4bdc846cb83631e0266a713" gracePeriod=30
Apr 16 16:45:15.735936 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:15.735900 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/817e0582-63b8-4993-9d69-bc56fc714010-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-56797944b-x2mm5\" (UID: \"817e0582-63b8-4993-9d69-bc56fc714010\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5"
Apr 16 16:45:15.836982 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:15.836883 2576 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/817e0582-63b8-4993-9d69-bc56fc714010-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-56797944b-x2mm5\" (UID: \"817e0582-63b8-4993-9d69-bc56fc714010\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" Apr 16 16:45:15.837325 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:15.837304 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/817e0582-63b8-4993-9d69-bc56fc714010-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-56797944b-x2mm5\" (UID: \"817e0582-63b8-4993-9d69-bc56fc714010\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" Apr 16 16:45:15.865715 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:15.865681 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" Apr 16 16:45:15.991851 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:15.991826 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5"] Apr 16 16:45:15.994365 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:45:15.994335 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod817e0582_63b8_4993_9d69_bc56fc714010.slice/crio-a19f15c0fca32fa9b915a10d247e2d1f75048cbe1248c20f06a85a14a3661346 WatchSource:0}: Error finding container a19f15c0fca32fa9b915a10d247e2d1f75048cbe1248c20f06a85a14a3661346: Status 404 returned error can't find the container with id a19f15c0fca32fa9b915a10d247e2d1f75048cbe1248c20f06a85a14a3661346 Apr 16 16:45:16.397153 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:16.397115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" event={"ID":"817e0582-63b8-4993-9d69-bc56fc714010","Type":"ContainerStarted","Data":"4111e84c4bad5be3b4360f6149a0b9aa0578f14b66d06b03561ba61ce69a65f0"} Apr 16 16:45:16.397153 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:16.397153 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" event={"ID":"817e0582-63b8-4993-9d69-bc56fc714010","Type":"ContainerStarted","Data":"a19f15c0fca32fa9b915a10d247e2d1f75048cbe1248c20f06a85a14a3661346"} Apr 16 16:45:22.253616 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:22.253570 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h" podUID="367f6856-4854-42a5-a7b1-40f0c7163116" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.55:8080/v2/models/sklearn-v2-mlserver/ready\": dial tcp 10.133.0.55:8080: connect: connection refused" Apr 16 16:45:22.417086 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:22.417051 2576 generic.go:358] "Generic (PLEG): container finished" podID="817e0582-63b8-4993-9d69-bc56fc714010" containerID="4111e84c4bad5be3b4360f6149a0b9aa0578f14b66d06b03561ba61ce69a65f0" exitCode=0 Apr 16 16:45:22.417273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:22.417119 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" event={"ID":"817e0582-63b8-4993-9d69-bc56fc714010","Type":"ContainerDied","Data":"4111e84c4bad5be3b4360f6149a0b9aa0578f14b66d06b03561ba61ce69a65f0"} Apr 16 16:45:23.422751 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:23.422698 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" 
event={"ID":"817e0582-63b8-4993-9d69-bc56fc714010","Type":"ContainerStarted","Data":"2508e9db0091a0088b34dbd8d9f2fd88098874ec9bba26682b2b9de21af367cf"} Apr 16 16:45:23.423202 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:23.423157 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" Apr 16 16:45:23.424502 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:23.424476 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" podUID="817e0582-63b8-4993-9d69-bc56fc714010" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused" Apr 16 16:45:23.425115 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:23.425091 2576 generic.go:358] "Generic (PLEG): container finished" podID="367f6856-4854-42a5-a7b1-40f0c7163116" containerID="92a75ef1006207e532ed5c5ab1d09133c00379e8c4bdc846cb83631e0266a713" exitCode=0 Apr 16 16:45:23.425188 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:23.425152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h" event={"ID":"367f6856-4854-42a5-a7b1-40f0c7163116","Type":"ContainerDied","Data":"92a75ef1006207e532ed5c5ab1d09133c00379e8c4bdc846cb83631e0266a713"} Apr 16 16:45:23.440816 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:23.440775 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" podStartSLOduration=8.440762305 podStartE2EDuration="8.440762305s" podCreationTimestamp="2026-04-16 16:45:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:45:23.438927058 +0000 UTC m=+2546.793917188" watchObservedRunningTime="2026-04-16 16:45:23.440762305 +0000 UTC 
m=+2546.795752434" Apr 16 16:45:23.449868 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:23.449848 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h" Apr 16 16:45:23.492502 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:23.492406 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/367f6856-4854-42a5-a7b1-40f0c7163116-kserve-provision-location\") pod \"367f6856-4854-42a5-a7b1-40f0c7163116\" (UID: \"367f6856-4854-42a5-a7b1-40f0c7163116\") " Apr 16 16:45:23.492807 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:23.492778 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/367f6856-4854-42a5-a7b1-40f0c7163116-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "367f6856-4854-42a5-a7b1-40f0c7163116" (UID: "367f6856-4854-42a5-a7b1-40f0c7163116"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:45:23.596651 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:23.596618 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/367f6856-4854-42a5-a7b1-40f0c7163116-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:45:24.430178 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:24.430137 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h" event={"ID":"367f6856-4854-42a5-a7b1-40f0c7163116","Type":"ContainerDied","Data":"44866ea7f9a21486e693b0d84730cfcd1d7a9f3a336ae5ef1332d475ef67a5a6"} Apr 16 16:45:24.430178 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:24.430171 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h" Apr 16 16:45:24.430695 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:24.430189 2576 scope.go:117] "RemoveContainer" containerID="92a75ef1006207e532ed5c5ab1d09133c00379e8c4bdc846cb83631e0266a713" Apr 16 16:45:24.430918 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:24.430888 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" podUID="817e0582-63b8-4993-9d69-bc56fc714010" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused" Apr 16 16:45:24.439582 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:24.439561 2576 scope.go:117] "RemoveContainer" containerID="6bb78ee63884aadeb148caf89cd63563847d01e1fc19917d8cf95da5764c9968" Apr 16 16:45:24.451484 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:24.451458 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h"] Apr 16 16:45:24.455353 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:24.455334 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-5fdf655847-fhg7h"] Apr 16 16:45:25.278625 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:25.278590 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="367f6856-4854-42a5-a7b1-40f0c7163116" path="/var/lib/kubelet/pods/367f6856-4854-42a5-a7b1-40f0c7163116/volumes" Apr 16 16:45:34.431595 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:34.431552 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" podUID="817e0582-63b8-4993-9d69-bc56fc714010" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused" Apr 16 16:45:44.432167 ip-10-0-137-150 kubenswrapper[2576]: I0416 
16:45:44.432128 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" Apr 16 16:45:52.561987 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:52.561948 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-56797944b-x2mm5_817e0582-63b8-4993-9d69-bc56fc714010/kserve-container/0.log" Apr 16 16:45:52.708747 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:52.708698 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5"] Apr 16 16:45:52.709050 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:52.709004 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" podUID="817e0582-63b8-4993-9d69-bc56fc714010" containerName="kserve-container" containerID="cri-o://2508e9db0091a0088b34dbd8d9f2fd88098874ec9bba26682b2b9de21af367cf" gracePeriod=30 Apr 16 16:45:52.771440 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:52.771403 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg"] Apr 16 16:45:52.771823 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:52.771803 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="367f6856-4854-42a5-a7b1-40f0c7163116" containerName="kserve-container" Apr 16 16:45:52.771823 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:52.771825 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="367f6856-4854-42a5-a7b1-40f0c7163116" containerName="kserve-container" Apr 16 16:45:52.772060 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:52.771837 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="367f6856-4854-42a5-a7b1-40f0c7163116" containerName="storage-initializer" Apr 16 16:45:52.772060 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:45:52.771845 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="367f6856-4854-42a5-a7b1-40f0c7163116" containerName="storage-initializer" Apr 16 16:45:52.772060 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:52.771943 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="367f6856-4854-42a5-a7b1-40f0c7163116" containerName="kserve-container" Apr 16 16:45:52.774325 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:52.774305 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" Apr 16 16:45:52.785062 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:52.785037 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg"] Apr 16 16:45:52.828367 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:52.828291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ab4993a-f137-4a2d-8053-05a43f7e5c02-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg\" (UID: \"0ab4993a-f137-4a2d-8053-05a43f7e5c02\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" Apr 16 16:45:52.929115 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:52.929064 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ab4993a-f137-4a2d-8053-05a43f7e5c02-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg\" (UID: \"0ab4993a-f137-4a2d-8053-05a43f7e5c02\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" Apr 16 16:45:52.929458 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:52.929439 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ab4993a-f137-4a2d-8053-05a43f7e5c02-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg\" (UID: \"0ab4993a-f137-4a2d-8053-05a43f7e5c02\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" Apr 16 16:45:53.085140 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:53.085019 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" Apr 16 16:45:53.207546 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:53.207516 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg"] Apr 16 16:45:53.210785 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:45:53.210696 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ab4993a_f137_4a2d_8053_05a43f7e5c02.slice/crio-ba3ecda07afcd1b99494242e23797164327b4b637944aa36d07b9585abaf91c9 WatchSource:0}: Error finding container ba3ecda07afcd1b99494242e23797164327b4b637944aa36d07b9585abaf91c9: Status 404 returned error can't find the container with id ba3ecda07afcd1b99494242e23797164327b4b637944aa36d07b9585abaf91c9 Apr 16 16:45:53.540555 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:53.540519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" event={"ID":"0ab4993a-f137-4a2d-8053-05a43f7e5c02","Type":"ContainerStarted","Data":"34e5c7809dd2cfe126c647e6008d99bc99ad124a0bff5e453dcd1791e2297ef5"} Apr 16 16:45:53.540555 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:53.540561 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" 
event={"ID":"0ab4993a-f137-4a2d-8053-05a43f7e5c02","Type":"ContainerStarted","Data":"ba3ecda07afcd1b99494242e23797164327b4b637944aa36d07b9585abaf91c9"} Apr 16 16:45:53.844517 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:53.844491 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" Apr 16 16:45:53.938198 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:53.938161 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/817e0582-63b8-4993-9d69-bc56fc714010-kserve-provision-location\") pod \"817e0582-63b8-4993-9d69-bc56fc714010\" (UID: \"817e0582-63b8-4993-9d69-bc56fc714010\") " Apr 16 16:45:53.943584 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:53.943550 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/817e0582-63b8-4993-9d69-bc56fc714010-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "817e0582-63b8-4993-9d69-bc56fc714010" (UID: "817e0582-63b8-4993-9d69-bc56fc714010"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:45:54.039534 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:54.039500 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/817e0582-63b8-4993-9d69-bc56fc714010-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:45:54.544305 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:54.544266 2576 generic.go:358] "Generic (PLEG): container finished" podID="817e0582-63b8-4993-9d69-bc56fc714010" containerID="2508e9db0091a0088b34dbd8d9f2fd88098874ec9bba26682b2b9de21af367cf" exitCode=0 Apr 16 16:45:54.544489 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:54.544357 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" Apr 16 16:45:54.544489 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:54.544355 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" event={"ID":"817e0582-63b8-4993-9d69-bc56fc714010","Type":"ContainerDied","Data":"2508e9db0091a0088b34dbd8d9f2fd88098874ec9bba26682b2b9de21af367cf"} Apr 16 16:45:54.544489 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:54.544473 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5" event={"ID":"817e0582-63b8-4993-9d69-bc56fc714010","Type":"ContainerDied","Data":"a19f15c0fca32fa9b915a10d247e2d1f75048cbe1248c20f06a85a14a3661346"} Apr 16 16:45:54.544625 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:54.544494 2576 scope.go:117] "RemoveContainer" containerID="2508e9db0091a0088b34dbd8d9f2fd88098874ec9bba26682b2b9de21af367cf" Apr 16 16:45:54.552966 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:54.552945 2576 scope.go:117] "RemoveContainer" 
containerID="4111e84c4bad5be3b4360f6149a0b9aa0578f14b66d06b03561ba61ce69a65f0" Apr 16 16:45:54.559998 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:54.559982 2576 scope.go:117] "RemoveContainer" containerID="2508e9db0091a0088b34dbd8d9f2fd88098874ec9bba26682b2b9de21af367cf" Apr 16 16:45:54.560248 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:45:54.560222 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2508e9db0091a0088b34dbd8d9f2fd88098874ec9bba26682b2b9de21af367cf\": container with ID starting with 2508e9db0091a0088b34dbd8d9f2fd88098874ec9bba26682b2b9de21af367cf not found: ID does not exist" containerID="2508e9db0091a0088b34dbd8d9f2fd88098874ec9bba26682b2b9de21af367cf" Apr 16 16:45:54.560314 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:54.560261 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2508e9db0091a0088b34dbd8d9f2fd88098874ec9bba26682b2b9de21af367cf"} err="failed to get container status \"2508e9db0091a0088b34dbd8d9f2fd88098874ec9bba26682b2b9de21af367cf\": rpc error: code = NotFound desc = could not find container \"2508e9db0091a0088b34dbd8d9f2fd88098874ec9bba26682b2b9de21af367cf\": container with ID starting with 2508e9db0091a0088b34dbd8d9f2fd88098874ec9bba26682b2b9de21af367cf not found: ID does not exist" Apr 16 16:45:54.560314 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:54.560286 2576 scope.go:117] "RemoveContainer" containerID="4111e84c4bad5be3b4360f6149a0b9aa0578f14b66d06b03561ba61ce69a65f0" Apr 16 16:45:54.560541 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:45:54.560517 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4111e84c4bad5be3b4360f6149a0b9aa0578f14b66d06b03561ba61ce69a65f0\": container with ID starting with 4111e84c4bad5be3b4360f6149a0b9aa0578f14b66d06b03561ba61ce69a65f0 not found: ID does not exist" 
containerID="4111e84c4bad5be3b4360f6149a0b9aa0578f14b66d06b03561ba61ce69a65f0" Apr 16 16:45:54.560635 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:54.560545 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4111e84c4bad5be3b4360f6149a0b9aa0578f14b66d06b03561ba61ce69a65f0"} err="failed to get container status \"4111e84c4bad5be3b4360f6149a0b9aa0578f14b66d06b03561ba61ce69a65f0\": rpc error: code = NotFound desc = could not find container \"4111e84c4bad5be3b4360f6149a0b9aa0578f14b66d06b03561ba61ce69a65f0\": container with ID starting with 4111e84c4bad5be3b4360f6149a0b9aa0578f14b66d06b03561ba61ce69a65f0 not found: ID does not exist" Apr 16 16:45:54.565547 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:54.565526 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5"] Apr 16 16:45:54.568611 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:54.568590 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-56797944b-x2mm5"] Apr 16 16:45:55.277944 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:55.277915 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="817e0582-63b8-4993-9d69-bc56fc714010" path="/var/lib/kubelet/pods/817e0582-63b8-4993-9d69-bc56fc714010/volumes" Apr 16 16:45:57.556714 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:57.556683 2576 generic.go:358] "Generic (PLEG): container finished" podID="0ab4993a-f137-4a2d-8053-05a43f7e5c02" containerID="34e5c7809dd2cfe126c647e6008d99bc99ad124a0bff5e453dcd1791e2297ef5" exitCode=0 Apr 16 16:45:57.557089 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:57.556726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" 
event={"ID":"0ab4993a-f137-4a2d-8053-05a43f7e5c02","Type":"ContainerDied","Data":"34e5c7809dd2cfe126c647e6008d99bc99ad124a0bff5e453dcd1791e2297ef5"} Apr 16 16:45:58.566977 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:58.566939 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" event={"ID":"0ab4993a-f137-4a2d-8053-05a43f7e5c02","Type":"ContainerStarted","Data":"1ff2086d3d7f30504b128ef597b86364b261ce2abd5366741cd058d1963a1f3d"} Apr 16 16:45:58.567508 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:58.567188 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" Apr 16 16:45:58.584051 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:45:58.583973 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" podStartSLOduration=6.583957086 podStartE2EDuration="6.583957086s" podCreationTimestamp="2026-04-16 16:45:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:45:58.583420873 +0000 UTC m=+2581.938411002" watchObservedRunningTime="2026-04-16 16:45:58.583957086 +0000 UTC m=+2581.938947216" Apr 16 16:46:29.625537 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:29.625433 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" podUID="0ab4993a-f137-4a2d-8053-05a43f7e5c02" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 16:46:39.573770 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:39.573736 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" Apr 16 16:46:42.891758 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:42.891718 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg"] Apr 16 16:46:42.892255 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:42.891975 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" podUID="0ab4993a-f137-4a2d-8053-05a43f7e5c02" containerName="kserve-container" containerID="cri-o://1ff2086d3d7f30504b128ef597b86364b261ce2abd5366741cd058d1963a1f3d" gracePeriod=30 Apr 16 16:46:42.946095 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:42.946056 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9"] Apr 16 16:46:42.946384 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:42.946371 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="817e0582-63b8-4993-9d69-bc56fc714010" containerName="kserve-container" Apr 16 16:46:42.946438 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:42.946386 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="817e0582-63b8-4993-9d69-bc56fc714010" containerName="kserve-container" Apr 16 16:46:42.946438 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:42.946404 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="817e0582-63b8-4993-9d69-bc56fc714010" containerName="storage-initializer" Apr 16 16:46:42.946438 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:42.946410 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="817e0582-63b8-4993-9d69-bc56fc714010" containerName="storage-initializer" Apr 16 16:46:42.946537 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:42.946478 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="817e0582-63b8-4993-9d69-bc56fc714010" containerName="kserve-container" Apr 16 16:46:42.948824 ip-10-0-137-150 kubenswrapper[2576]: 
I0416 16:46:42.948794 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" Apr 16 16:46:42.956416 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:42.956391 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9"] Apr 16 16:46:43.022343 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:43.022309 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a588298-3c56-44ea-bfaa-f8d9211ca2f4-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9\" (UID: \"2a588298-3c56-44ea-bfaa-f8d9211ca2f4\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" Apr 16 16:46:43.123042 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:43.123011 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a588298-3c56-44ea-bfaa-f8d9211ca2f4-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9\" (UID: \"2a588298-3c56-44ea-bfaa-f8d9211ca2f4\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" Apr 16 16:46:43.123388 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:43.123370 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a588298-3c56-44ea-bfaa-f8d9211ca2f4-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9\" (UID: \"2a588298-3c56-44ea-bfaa-f8d9211ca2f4\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" Apr 16 16:46:43.259288 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:43.259170 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" Apr 16 16:46:43.587481 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:43.587400 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9"] Apr 16 16:46:43.590980 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:46:43.590939 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a588298_3c56_44ea_bfaa_f8d9211ca2f4.slice/crio-14cfb403566b11c3435da5a6a3052f01e9873de85daff3e0399bfbef12e8ff80 WatchSource:0}: Error finding container 14cfb403566b11c3435da5a6a3052f01e9873de85daff3e0399bfbef12e8ff80: Status 404 returned error can't find the container with id 14cfb403566b11c3435da5a6a3052f01e9873de85daff3e0399bfbef12e8ff80 Apr 16 16:46:43.710273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:43.710235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" event={"ID":"2a588298-3c56-44ea-bfaa-f8d9211ca2f4","Type":"ContainerStarted","Data":"43304789b9a5b0964b7dcf072e71a16c1ab6f47b81fb24477df785816d6d2c36"} Apr 16 16:46:43.710273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:43.710275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" event={"ID":"2a588298-3c56-44ea-bfaa-f8d9211ca2f4","Type":"ContainerStarted","Data":"14cfb403566b11c3435da5a6a3052f01e9873de85daff3e0399bfbef12e8ff80"} Apr 16 16:46:47.722857 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:47.722765 2576 generic.go:358] "Generic (PLEG): container finished" podID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" containerID="43304789b9a5b0964b7dcf072e71a16c1ab6f47b81fb24477df785816d6d2c36" exitCode=0 Apr 16 16:46:47.722857 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:47.722821 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" event={"ID":"2a588298-3c56-44ea-bfaa-f8d9211ca2f4","Type":"ContainerDied","Data":"43304789b9a5b0964b7dcf072e71a16c1ab6f47b81fb24477df785816d6d2c36"} Apr 16 16:46:48.728273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:48.728237 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" event={"ID":"2a588298-3c56-44ea-bfaa-f8d9211ca2f4","Type":"ContainerStarted","Data":"27fa297e16298cffb540602092e0e0d0cfc599ed7934994628bb5db696e8888f"} Apr 16 16:46:48.728740 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:48.728567 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" Apr 16 16:46:48.729992 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:48.729957 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" podUID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 16:46:48.744792 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:48.744741 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" podStartSLOduration=6.7447261560000005 podStartE2EDuration="6.744726156s" podCreationTimestamp="2026-04-16 16:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:46:48.742711802 +0000 UTC m=+2632.097701931" watchObservedRunningTime="2026-04-16 16:46:48.744726156 +0000 UTC m=+2632.099716285" Apr 16 16:46:49.571578 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:49.571534 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" podUID="0ab4993a-f137-4a2d-8053-05a43f7e5c02" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.57:8080/v2/models/isvc-sklearn-v2-runtime/ready\": dial tcp 10.133.0.57:8080: connect: connection refused" Apr 16 16:46:49.732455 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:49.732409 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" podUID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 16:46:50.631322 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.631298 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" Apr 16 16:46:50.687050 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.686975 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ab4993a-f137-4a2d-8053-05a43f7e5c02-kserve-provision-location\") pod \"0ab4993a-f137-4a2d-8053-05a43f7e5c02\" (UID: \"0ab4993a-f137-4a2d-8053-05a43f7e5c02\") " Apr 16 16:46:50.687360 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.687337 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ab4993a-f137-4a2d-8053-05a43f7e5c02-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0ab4993a-f137-4a2d-8053-05a43f7e5c02" (UID: "0ab4993a-f137-4a2d-8053-05a43f7e5c02"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:46:50.736598 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.736563 2576 generic.go:358] "Generic (PLEG): container finished" podID="0ab4993a-f137-4a2d-8053-05a43f7e5c02" containerID="1ff2086d3d7f30504b128ef597b86364b261ce2abd5366741cd058d1963a1f3d" exitCode=0 Apr 16 16:46:50.737019 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.736635 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" Apr 16 16:46:50.737019 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.736640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" event={"ID":"0ab4993a-f137-4a2d-8053-05a43f7e5c02","Type":"ContainerDied","Data":"1ff2086d3d7f30504b128ef597b86364b261ce2abd5366741cd058d1963a1f3d"} Apr 16 16:46:50.737019 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.736677 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg" event={"ID":"0ab4993a-f137-4a2d-8053-05a43f7e5c02","Type":"ContainerDied","Data":"ba3ecda07afcd1b99494242e23797164327b4b637944aa36d07b9585abaf91c9"} Apr 16 16:46:50.737019 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.736693 2576 scope.go:117] "RemoveContainer" containerID="1ff2086d3d7f30504b128ef597b86364b261ce2abd5366741cd058d1963a1f3d" Apr 16 16:46:50.744845 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.744825 2576 scope.go:117] "RemoveContainer" containerID="34e5c7809dd2cfe126c647e6008d99bc99ad124a0bff5e453dcd1791e2297ef5" Apr 16 16:46:50.751802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.751786 2576 scope.go:117] "RemoveContainer" containerID="1ff2086d3d7f30504b128ef597b86364b261ce2abd5366741cd058d1963a1f3d" Apr 16 16:46:50.752019 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:46:50.751998 2576 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff2086d3d7f30504b128ef597b86364b261ce2abd5366741cd058d1963a1f3d\": container with ID starting with 1ff2086d3d7f30504b128ef597b86364b261ce2abd5366741cd058d1963a1f3d not found: ID does not exist" containerID="1ff2086d3d7f30504b128ef597b86364b261ce2abd5366741cd058d1963a1f3d" Apr 16 16:46:50.752068 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.752027 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff2086d3d7f30504b128ef597b86364b261ce2abd5366741cd058d1963a1f3d"} err="failed to get container status \"1ff2086d3d7f30504b128ef597b86364b261ce2abd5366741cd058d1963a1f3d\": rpc error: code = NotFound desc = could not find container \"1ff2086d3d7f30504b128ef597b86364b261ce2abd5366741cd058d1963a1f3d\": container with ID starting with 1ff2086d3d7f30504b128ef597b86364b261ce2abd5366741cd058d1963a1f3d not found: ID does not exist" Apr 16 16:46:50.752068 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.752044 2576 scope.go:117] "RemoveContainer" containerID="34e5c7809dd2cfe126c647e6008d99bc99ad124a0bff5e453dcd1791e2297ef5" Apr 16 16:46:50.752300 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:46:50.752281 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34e5c7809dd2cfe126c647e6008d99bc99ad124a0bff5e453dcd1791e2297ef5\": container with ID starting with 34e5c7809dd2cfe126c647e6008d99bc99ad124a0bff5e453dcd1791e2297ef5 not found: ID does not exist" containerID="34e5c7809dd2cfe126c647e6008d99bc99ad124a0bff5e453dcd1791e2297ef5" Apr 16 16:46:50.752342 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.752307 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34e5c7809dd2cfe126c647e6008d99bc99ad124a0bff5e453dcd1791e2297ef5"} err="failed to get container status 
\"34e5c7809dd2cfe126c647e6008d99bc99ad124a0bff5e453dcd1791e2297ef5\": rpc error: code = NotFound desc = could not find container \"34e5c7809dd2cfe126c647e6008d99bc99ad124a0bff5e453dcd1791e2297ef5\": container with ID starting with 34e5c7809dd2cfe126c647e6008d99bc99ad124a0bff5e453dcd1791e2297ef5 not found: ID does not exist" Apr 16 16:46:50.757741 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.757721 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg"] Apr 16 16:46:50.761255 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.761234 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7f9684dc47-qcqtg"] Apr 16 16:46:50.788475 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:50.788449 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ab4993a-f137-4a2d-8053-05a43f7e5c02-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:46:51.279106 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:51.279074 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ab4993a-f137-4a2d-8053-05a43f7e5c02" path="/var/lib/kubelet/pods/0ab4993a-f137-4a2d-8053-05a43f7e5c02/volumes" Apr 16 16:46:59.732555 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:46:59.732513 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" podUID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 16:47:09.733389 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:47:09.733347 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" podUID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 16:47:19.732520 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:47:19.732478 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" podUID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 16:47:29.732492 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:47:29.732452 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" podUID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 16:47:39.732790 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:47:39.732738 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" podUID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 16:47:49.733355 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:47:49.733262 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" podUID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 16:47:59.733402 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:47:59.733369 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" Apr 16 16:48:03.166554 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.166517 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9"] Apr 16 16:48:03.166940 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.166751 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" podUID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" containerName="kserve-container" containerID="cri-o://27fa297e16298cffb540602092e0e0d0cfc599ed7934994628bb5db696e8888f" gracePeriod=30 Apr 16 16:48:03.268364 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.268329 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g"] Apr 16 16:48:03.268667 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.268654 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ab4993a-f137-4a2d-8053-05a43f7e5c02" containerName="storage-initializer" Apr 16 16:48:03.268716 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.268668 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab4993a-f137-4a2d-8053-05a43f7e5c02" containerName="storage-initializer" Apr 16 16:48:03.268716 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.268688 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ab4993a-f137-4a2d-8053-05a43f7e5c02" containerName="kserve-container" Apr 16 16:48:03.268716 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.268694 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab4993a-f137-4a2d-8053-05a43f7e5c02" containerName="kserve-container" Apr 16 16:48:03.268821 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.268772 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ab4993a-f137-4a2d-8053-05a43f7e5c02" containerName="kserve-container" Apr 16 16:48:03.270641 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.270624 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" Apr 16 16:48:03.288933 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.288907 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g"] Apr 16 16:48:03.340512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.340476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dac40226-d90d-4a2c-b311-53101299950c-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g\" (UID: \"dac40226-d90d-4a2c-b311-53101299950c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" Apr 16 16:48:03.441326 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.441241 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dac40226-d90d-4a2c-b311-53101299950c-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g\" (UID: \"dac40226-d90d-4a2c-b311-53101299950c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" Apr 16 16:48:03.441600 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.441580 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dac40226-d90d-4a2c-b311-53101299950c-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g\" (UID: \"dac40226-d90d-4a2c-b311-53101299950c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" Apr 16 16:48:03.580325 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.580291 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" Apr 16 16:48:03.706175 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.706105 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g"] Apr 16 16:48:03.710105 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:48:03.710077 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddac40226_d90d_4a2c_b311_53101299950c.slice/crio-c0828e293a9e96bbf4ac77a0edc68f586936370973c877fb7dc151867895c854 WatchSource:0}: Error finding container c0828e293a9e96bbf4ac77a0edc68f586936370973c877fb7dc151867895c854: Status 404 returned error can't find the container with id c0828e293a9e96bbf4ac77a0edc68f586936370973c877fb7dc151867895c854 Apr 16 16:48:03.962170 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.962090 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" event={"ID":"dac40226-d90d-4a2c-b311-53101299950c","Type":"ContainerStarted","Data":"000fc0b045b84139a680b968484eae30199fb6f6d15390c49183687ad83730a7"} Apr 16 16:48:03.962170 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:03.962124 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" event={"ID":"dac40226-d90d-4a2c-b311-53101299950c","Type":"ContainerStarted","Data":"c0828e293a9e96bbf4ac77a0edc68f586936370973c877fb7dc151867895c854"} Apr 16 16:48:07.207256 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.207233 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" Apr 16 16:48:07.271061 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.271030 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a588298-3c56-44ea-bfaa-f8d9211ca2f4-kserve-provision-location\") pod \"2a588298-3c56-44ea-bfaa-f8d9211ca2f4\" (UID: \"2a588298-3c56-44ea-bfaa-f8d9211ca2f4\") " Apr 16 16:48:07.271355 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.271334 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a588298-3c56-44ea-bfaa-f8d9211ca2f4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2a588298-3c56-44ea-bfaa-f8d9211ca2f4" (UID: "2a588298-3c56-44ea-bfaa-f8d9211ca2f4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:48:07.372308 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.372282 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2a588298-3c56-44ea-bfaa-f8d9211ca2f4-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:48:07.974737 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.974705 2576 generic.go:358] "Generic (PLEG): container finished" podID="dac40226-d90d-4a2c-b311-53101299950c" containerID="000fc0b045b84139a680b968484eae30199fb6f6d15390c49183687ad83730a7" exitCode=0 Apr 16 16:48:07.974921 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.974787 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" event={"ID":"dac40226-d90d-4a2c-b311-53101299950c","Type":"ContainerDied","Data":"000fc0b045b84139a680b968484eae30199fb6f6d15390c49183687ad83730a7"} Apr 16 16:48:07.975975 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:48:07.975953 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:48:07.976184 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.976159 2576 generic.go:358] "Generic (PLEG): container finished" podID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" containerID="27fa297e16298cffb540602092e0e0d0cfc599ed7934994628bb5db696e8888f" exitCode=0 Apr 16 16:48:07.976322 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.976200 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" event={"ID":"2a588298-3c56-44ea-bfaa-f8d9211ca2f4","Type":"ContainerDied","Data":"27fa297e16298cffb540602092e0e0d0cfc599ed7934994628bb5db696e8888f"} Apr 16 16:48:07.976322 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.976250 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" event={"ID":"2a588298-3c56-44ea-bfaa-f8d9211ca2f4","Type":"ContainerDied","Data":"14cfb403566b11c3435da5a6a3052f01e9873de85daff3e0399bfbef12e8ff80"} Apr 16 16:48:07.976322 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.976250 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9" Apr 16 16:48:07.976322 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.976262 2576 scope.go:117] "RemoveContainer" containerID="27fa297e16298cffb540602092e0e0d0cfc599ed7934994628bb5db696e8888f" Apr 16 16:48:07.988510 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.988490 2576 scope.go:117] "RemoveContainer" containerID="43304789b9a5b0964b7dcf072e71a16c1ab6f47b81fb24477df785816d6d2c36" Apr 16 16:48:07.998600 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.998583 2576 scope.go:117] "RemoveContainer" containerID="27fa297e16298cffb540602092e0e0d0cfc599ed7934994628bb5db696e8888f" Apr 16 16:48:07.998927 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:48:07.998899 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27fa297e16298cffb540602092e0e0d0cfc599ed7934994628bb5db696e8888f\": container with ID starting with 27fa297e16298cffb540602092e0e0d0cfc599ed7934994628bb5db696e8888f not found: ID does not exist" containerID="27fa297e16298cffb540602092e0e0d0cfc599ed7934994628bb5db696e8888f" Apr 16 16:48:07.999144 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.999097 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27fa297e16298cffb540602092e0e0d0cfc599ed7934994628bb5db696e8888f"} err="failed to get container status \"27fa297e16298cffb540602092e0e0d0cfc599ed7934994628bb5db696e8888f\": rpc error: code = NotFound desc = could not find container \"27fa297e16298cffb540602092e0e0d0cfc599ed7934994628bb5db696e8888f\": container with ID starting with 27fa297e16298cffb540602092e0e0d0cfc599ed7934994628bb5db696e8888f not found: ID does not exist" Apr 16 16:48:07.999273 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.999260 2576 scope.go:117] "RemoveContainer" containerID="43304789b9a5b0964b7dcf072e71a16c1ab6f47b81fb24477df785816d6d2c36" Apr 16 
16:48:07.999671 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:48:07.999646 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43304789b9a5b0964b7dcf072e71a16c1ab6f47b81fb24477df785816d6d2c36\": container with ID starting with 43304789b9a5b0964b7dcf072e71a16c1ab6f47b81fb24477df785816d6d2c36 not found: ID does not exist" containerID="43304789b9a5b0964b7dcf072e71a16c1ab6f47b81fb24477df785816d6d2c36" Apr 16 16:48:07.999750 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:07.999680 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43304789b9a5b0964b7dcf072e71a16c1ab6f47b81fb24477df785816d6d2c36"} err="failed to get container status \"43304789b9a5b0964b7dcf072e71a16c1ab6f47b81fb24477df785816d6d2c36\": rpc error: code = NotFound desc = could not find container \"43304789b9a5b0964b7dcf072e71a16c1ab6f47b81fb24477df785816d6d2c36\": container with ID starting with 43304789b9a5b0964b7dcf072e71a16c1ab6f47b81fb24477df785816d6d2c36 not found: ID does not exist" Apr 16 16:48:08.016894 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:08.016872 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9"] Apr 16 16:48:08.019187 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:08.019152 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5ff9c785b8-7c7x9"] Apr 16 16:48:08.981520 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:08.981474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" event={"ID":"dac40226-d90d-4a2c-b311-53101299950c","Type":"ContainerStarted","Data":"5dd95d33fdddb76c576352fffca3afe76ecb22aee0dada0a5a24430d5e23ed07"} Apr 16 16:48:08.981993 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:08.981821 2576 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" Apr 16 16:48:08.983103 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:08.983081 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" podUID="dac40226-d90d-4a2c-b311-53101299950c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 16 16:48:08.997612 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:08.997570 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" podStartSLOduration=5.997555436 podStartE2EDuration="5.997555436s" podCreationTimestamp="2026-04-16 16:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:48:08.996986548 +0000 UTC m=+2712.351976678" watchObservedRunningTime="2026-04-16 16:48:08.997555436 +0000 UTC m=+2712.352545566" Apr 16 16:48:09.278916 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:09.278841 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" path="/var/lib/kubelet/pods/2a588298-3c56-44ea-bfaa-f8d9211ca2f4/volumes" Apr 16 16:48:09.984782 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:09.984745 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" podUID="dac40226-d90d-4a2c-b311-53101299950c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused" Apr 16 16:48:19.985397 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:19.985354 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" 
podUID="dac40226-d90d-4a2c-b311-53101299950c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused"
Apr 16 16:48:29.984864 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:29.984827 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" podUID="dac40226-d90d-4a2c-b311-53101299950c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused"
Apr 16 16:48:39.985586 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:39.985546 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" podUID="dac40226-d90d-4a2c-b311-53101299950c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused"
Apr 16 16:48:49.985069 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:49.985026 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" podUID="dac40226-d90d-4a2c-b311-53101299950c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused"
Apr 16 16:48:59.985186 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:48:59.985145 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" podUID="dac40226-d90d-4a2c-b311-53101299950c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused"
Apr 16 16:49:09.985389 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:09.985349 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" podUID="dac40226-d90d-4a2c-b311-53101299950c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused"
Apr 16 16:49:19.986454 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:19.986374 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g"
Apr 16 16:49:23.348299 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:23.348268 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g"]
Apr 16 16:49:23.348697 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:23.348521 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" podUID="dac40226-d90d-4a2c-b311-53101299950c" containerName="kserve-container" containerID="cri-o://5dd95d33fdddb76c576352fffca3afe76ecb22aee0dada0a5a24430d5e23ed07" gracePeriod=30
Apr 16 16:49:23.405471 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:23.405437 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm"]
Apr 16 16:49:23.405755 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:23.405742 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" containerName="kserve-container"
Apr 16 16:49:23.405808 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:23.405757 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" containerName="kserve-container"
Apr 16 16:49:23.405808 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:23.405779 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" containerName="storage-initializer"
Apr 16 16:49:23.405808 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:23.405785 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" containerName="storage-initializer"
Apr 16 16:49:23.405901 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:23.405836 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a588298-3c56-44ea-bfaa-f8d9211ca2f4" containerName="kserve-container"
Apr 16 16:49:23.410301 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:23.410283 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm"
Apr 16 16:49:23.418479 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:23.418266 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm"]
Apr 16 16:49:23.449310 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:23.449279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d23a8c0-9b4c-4789-ba33-747199d2a496-kserve-provision-location\") pod \"isvc-tensorflow-predictor-864f6b7649-gcsbm\" (UID: \"6d23a8c0-9b4c-4789-ba33-747199d2a496\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm"
Apr 16 16:49:23.550613 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:23.550570 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d23a8c0-9b4c-4789-ba33-747199d2a496-kserve-provision-location\") pod \"isvc-tensorflow-predictor-864f6b7649-gcsbm\" (UID: \"6d23a8c0-9b4c-4789-ba33-747199d2a496\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm"
Apr 16 16:49:23.550913 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:23.550896 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d23a8c0-9b4c-4789-ba33-747199d2a496-kserve-provision-location\") pod \"isvc-tensorflow-predictor-864f6b7649-gcsbm\" (UID: \"6d23a8c0-9b4c-4789-ba33-747199d2a496\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm"
Apr 16 16:49:23.722402 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:23.722369 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm"
Apr 16 16:49:23.841333 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:23.841309 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm"]
Apr 16 16:49:23.843878 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:49:23.843839 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d23a8c0_9b4c_4789_ba33_747199d2a496.slice/crio-52d12580544bc112f636a66641e0550f784c5d8988db0d8f562105c473491f52 WatchSource:0}: Error finding container 52d12580544bc112f636a66641e0550f784c5d8988db0d8f562105c473491f52: Status 404 returned error can't find the container with id 52d12580544bc112f636a66641e0550f784c5d8988db0d8f562105c473491f52
Apr 16 16:49:24.213912 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:24.213879 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm" event={"ID":"6d23a8c0-9b4c-4789-ba33-747199d2a496","Type":"ContainerStarted","Data":"b487e6ecfffed92b3f37a09ff377b7677db622fba53947317812b25055e3bad8"}
Apr 16 16:49:24.213912 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:24.213914 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm" event={"ID":"6d23a8c0-9b4c-4789-ba33-747199d2a496","Type":"ContainerStarted","Data":"52d12580544bc112f636a66641e0550f784c5d8988db0d8f562105c473491f52"}
Apr 16 16:49:27.693590 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:27.693565 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g"
Apr 16 16:49:27.781133 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:27.781020 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dac40226-d90d-4a2c-b311-53101299950c-kserve-provision-location\") pod \"dac40226-d90d-4a2c-b311-53101299950c\" (UID: \"dac40226-d90d-4a2c-b311-53101299950c\") "
Apr 16 16:49:27.781423 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:27.781397 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dac40226-d90d-4a2c-b311-53101299950c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dac40226-d90d-4a2c-b311-53101299950c" (UID: "dac40226-d90d-4a2c-b311-53101299950c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:49:27.881946 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:27.881906 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dac40226-d90d-4a2c-b311-53101299950c-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:49:28.229675 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:28.229637 2576 generic.go:358] "Generic (PLEG): container finished" podID="dac40226-d90d-4a2c-b311-53101299950c" containerID="5dd95d33fdddb76c576352fffca3afe76ecb22aee0dada0a5a24430d5e23ed07" exitCode=0
Apr 16 16:49:28.229839 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:28.229721 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g"
Apr 16 16:49:28.229839 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:28.229726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" event={"ID":"dac40226-d90d-4a2c-b311-53101299950c","Type":"ContainerDied","Data":"5dd95d33fdddb76c576352fffca3afe76ecb22aee0dada0a5a24430d5e23ed07"}
Apr 16 16:49:28.229839 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:28.229776 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g" event={"ID":"dac40226-d90d-4a2c-b311-53101299950c","Type":"ContainerDied","Data":"c0828e293a9e96bbf4ac77a0edc68f586936370973c877fb7dc151867895c854"}
Apr 16 16:49:28.229839 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:28.229796 2576 scope.go:117] "RemoveContainer" containerID="5dd95d33fdddb76c576352fffca3afe76ecb22aee0dada0a5a24430d5e23ed07"
Apr 16 16:49:28.239105 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:28.239082 2576 scope.go:117] "RemoveContainer" containerID="000fc0b045b84139a680b968484eae30199fb6f6d15390c49183687ad83730a7"
Apr 16 16:49:28.246800 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:28.246778 2576 scope.go:117] "RemoveContainer" containerID="5dd95d33fdddb76c576352fffca3afe76ecb22aee0dada0a5a24430d5e23ed07"
Apr 16 16:49:28.247047 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:49:28.247024 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dd95d33fdddb76c576352fffca3afe76ecb22aee0dada0a5a24430d5e23ed07\": container with ID starting with 5dd95d33fdddb76c576352fffca3afe76ecb22aee0dada0a5a24430d5e23ed07 not found: ID does not exist" containerID="5dd95d33fdddb76c576352fffca3afe76ecb22aee0dada0a5a24430d5e23ed07"
Apr 16 16:49:28.247090 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:28.247059 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dd95d33fdddb76c576352fffca3afe76ecb22aee0dada0a5a24430d5e23ed07"} err="failed to get container status \"5dd95d33fdddb76c576352fffca3afe76ecb22aee0dada0a5a24430d5e23ed07\": rpc error: code = NotFound desc = could not find container \"5dd95d33fdddb76c576352fffca3afe76ecb22aee0dada0a5a24430d5e23ed07\": container with ID starting with 5dd95d33fdddb76c576352fffca3afe76ecb22aee0dada0a5a24430d5e23ed07 not found: ID does not exist"
Apr 16 16:49:28.247090 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:28.247076 2576 scope.go:117] "RemoveContainer" containerID="000fc0b045b84139a680b968484eae30199fb6f6d15390c49183687ad83730a7"
Apr 16 16:49:28.247317 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:49:28.247300 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"000fc0b045b84139a680b968484eae30199fb6f6d15390c49183687ad83730a7\": container with ID starting with 000fc0b045b84139a680b968484eae30199fb6f6d15390c49183687ad83730a7 not found: ID does not exist" containerID="000fc0b045b84139a680b968484eae30199fb6f6d15390c49183687ad83730a7"
Apr 16 16:49:28.247365 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:28.247325 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"000fc0b045b84139a680b968484eae30199fb6f6d15390c49183687ad83730a7"} err="failed to get container status \"000fc0b045b84139a680b968484eae30199fb6f6d15390c49183687ad83730a7\": rpc error: code = NotFound desc = could not find container \"000fc0b045b84139a680b968484eae30199fb6f6d15390c49183687ad83730a7\": container with ID starting with 000fc0b045b84139a680b968484eae30199fb6f6d15390c49183687ad83730a7 not found: ID does not exist"
Apr 16 16:49:28.250831 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:28.250810 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g"]
Apr 16 16:49:28.254862 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:28.254841 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7bb9cdb449-k9g4g"]
Apr 16 16:49:29.233929 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:29.233901 2576 generic.go:358] "Generic (PLEG): container finished" podID="6d23a8c0-9b4c-4789-ba33-747199d2a496" containerID="b487e6ecfffed92b3f37a09ff377b7677db622fba53947317812b25055e3bad8" exitCode=0
Apr 16 16:49:29.234309 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:29.233974 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm" event={"ID":"6d23a8c0-9b4c-4789-ba33-747199d2a496","Type":"ContainerDied","Data":"b487e6ecfffed92b3f37a09ff377b7677db622fba53947317812b25055e3bad8"}
Apr 16 16:49:29.278860 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:29.278834 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dac40226-d90d-4a2c-b311-53101299950c" path="/var/lib/kubelet/pods/dac40226-d90d-4a2c-b311-53101299950c/volumes"
Apr 16 16:49:33.251279 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:33.251245 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm" event={"ID":"6d23a8c0-9b4c-4789-ba33-747199d2a496","Type":"ContainerStarted","Data":"9b945855aa280ef97194c51fbc8c790ac9c76e0fc7b9caebb04261dc78ed4553"}
Apr 16 16:49:33.251643 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:33.251517 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm"
Apr 16 16:49:33.252685 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:33.252657 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm" podUID="6d23a8c0-9b4c-4789-ba33-747199d2a496" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused"
Apr 16 16:49:33.269011 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:33.268930 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm" podStartSLOduration=6.494281932 podStartE2EDuration="10.268919229s" podCreationTimestamp="2026-04-16 16:49:23 +0000 UTC" firstStartedPulling="2026-04-16 16:49:29.235150477 +0000 UTC m=+2792.590140583" lastFinishedPulling="2026-04-16 16:49:33.009787773 +0000 UTC m=+2796.364777880" observedRunningTime="2026-04-16 16:49:33.267122514 +0000 UTC m=+2796.622112643" watchObservedRunningTime="2026-04-16 16:49:33.268919229 +0000 UTC m=+2796.623909357"
Apr 16 16:49:34.255439 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:34.255404 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm" podUID="6d23a8c0-9b4c-4789-ba33-747199d2a496" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.60:8080: connect: connection refused"
Apr 16 16:49:44.256110 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:49:44.256081 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm"
Apr 16 16:50:03.556096 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:03.556058 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm"]
Apr 16 16:50:03.556774 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:03.556426 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm" podUID="6d23a8c0-9b4c-4789-ba33-747199d2a496" containerName="kserve-container" containerID="cri-o://9b945855aa280ef97194c51fbc8c790ac9c76e0fc7b9caebb04261dc78ed4553" gracePeriod=30
Apr 16 16:50:03.622176 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:03.622144 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq"]
Apr 16 16:50:03.622478 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:03.622466 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dac40226-d90d-4a2c-b311-53101299950c" containerName="storage-initializer"
Apr 16 16:50:03.622523 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:03.622480 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac40226-d90d-4a2c-b311-53101299950c" containerName="storage-initializer"
Apr 16 16:50:03.622523 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:03.622498 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dac40226-d90d-4a2c-b311-53101299950c" containerName="kserve-container"
Apr 16 16:50:03.622523 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:03.622504 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dac40226-d90d-4a2c-b311-53101299950c" containerName="kserve-container"
Apr 16 16:50:03.622621 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:03.622559 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dac40226-d90d-4a2c-b311-53101299950c" containerName="kserve-container"
Apr 16 16:50:03.629309 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:03.629287 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq"
Apr 16 16:50:03.639847 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:03.639822 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq"]
Apr 16 16:50:03.753711 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:03.753674 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59c8bd03-5046-4461-9e75-be04f23c7291-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq\" (UID: \"59c8bd03-5046-4461-9e75-be04f23c7291\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq"
Apr 16 16:50:03.854775 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:03.854696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59c8bd03-5046-4461-9e75-be04f23c7291-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq\" (UID: \"59c8bd03-5046-4461-9e75-be04f23c7291\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq"
Apr 16 16:50:03.855037 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:03.855018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59c8bd03-5046-4461-9e75-be04f23c7291-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq\" (UID: \"59c8bd03-5046-4461-9e75-be04f23c7291\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq"
Apr 16 16:50:03.939863 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:03.939839 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq"
Apr 16 16:50:04.059774 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:04.059746 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq"]
Apr 16 16:50:04.062016 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:50:04.061981 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59c8bd03_5046_4461_9e75_be04f23c7291.slice/crio-66a6c46aced653773a7646407c13711c5ed863ad12f55ef558c196e6decf651b WatchSource:0}: Error finding container 66a6c46aced653773a7646407c13711c5ed863ad12f55ef558c196e6decf651b: Status 404 returned error can't find the container with id 66a6c46aced653773a7646407c13711c5ed863ad12f55ef558c196e6decf651b
Apr 16 16:50:04.347833 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:04.347791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq" event={"ID":"59c8bd03-5046-4461-9e75-be04f23c7291","Type":"ContainerStarted","Data":"e3f28eca48d7bce4403f9ca2a9c5988a99bfabd28f639a218e85beee63791ab0"}
Apr 16 16:50:04.347988 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:04.347841 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq" event={"ID":"59c8bd03-5046-4461-9e75-be04f23c7291","Type":"ContainerStarted","Data":"66a6c46aced653773a7646407c13711c5ed863ad12f55ef558c196e6decf651b"}
Apr 16 16:50:10.367426 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:10.367395 2576 generic.go:358] "Generic (PLEG): container finished" podID="59c8bd03-5046-4461-9e75-be04f23c7291" containerID="e3f28eca48d7bce4403f9ca2a9c5988a99bfabd28f639a218e85beee63791ab0" exitCode=0
Apr 16 16:50:10.367796 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:10.367470 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq" event={"ID":"59c8bd03-5046-4461-9e75-be04f23c7291","Type":"ContainerDied","Data":"e3f28eca48d7bce4403f9ca2a9c5988a99bfabd28f639a218e85beee63791ab0"}
Apr 16 16:50:11.372336 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:11.372299 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq" event={"ID":"59c8bd03-5046-4461-9e75-be04f23c7291","Type":"ContainerStarted","Data":"566d64f3e05ba83f0e6eace49e4435be05e3d7cd132d9b9c55a4822a8f41ebb6"}
Apr 16 16:50:11.372735 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:11.372585 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq"
Apr 16 16:50:11.373963 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:11.373937 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq" podUID="59c8bd03-5046-4461-9e75-be04f23c7291" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused"
Apr 16 16:50:11.394874 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:11.394827 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq" podStartSLOduration=8.394812438 podStartE2EDuration="8.394812438s" podCreationTimestamp="2026-04-16 16:50:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:50:11.394232623 +0000 UTC m=+2834.749222743" watchObservedRunningTime="2026-04-16 16:50:11.394812438 +0000 UTC m=+2834.749802568"
Apr 16 16:50:12.376708 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:12.376674 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq" podUID="59c8bd03-5046-4461-9e75-be04f23c7291" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.61:8080: connect: connection refused"
Apr 16 16:50:22.378038 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:22.378006 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq"
Apr 16 16:50:33.866037 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:33.865962 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq"]
Apr 16 16:50:33.866543 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:33.866346 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq" podUID="59c8bd03-5046-4461-9e75-be04f23c7291" containerName="kserve-container" containerID="cri-o://566d64f3e05ba83f0e6eace49e4435be05e3d7cd132d9b9c55a4822a8f41ebb6" gracePeriod=30
Apr 16 16:50:33.941055 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:33.941020 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds"]
Apr 16 16:50:33.944342 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:33.944320 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds"
Apr 16 16:50:33.954646 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:33.954623 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds"]
Apr 16 16:50:33.999203 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:33.999178 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6353c00e-6b28-4e75-93fa-7e720bedda09-kserve-provision-location\") pod \"isvc-triton-predictor-5fc768bcf-lc4ds\" (UID: \"6353c00e-6b28-4e75-93fa-7e720bedda09\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds"
Apr 16 16:50:34.100425 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.100386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6353c00e-6b28-4e75-93fa-7e720bedda09-kserve-provision-location\") pod \"isvc-triton-predictor-5fc768bcf-lc4ds\" (UID: \"6353c00e-6b28-4e75-93fa-7e720bedda09\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds"
Apr 16 16:50:34.100741 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.100721 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6353c00e-6b28-4e75-93fa-7e720bedda09-kserve-provision-location\") pod \"isvc-triton-predictor-5fc768bcf-lc4ds\" (UID: \"6353c00e-6b28-4e75-93fa-7e720bedda09\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds"
Apr 16 16:50:34.193865 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.193845 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm"
Apr 16 16:50:34.201425 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.201404 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d23a8c0-9b4c-4789-ba33-747199d2a496-kserve-provision-location\") pod \"6d23a8c0-9b4c-4789-ba33-747199d2a496\" (UID: \"6d23a8c0-9b4c-4789-ba33-747199d2a496\") "
Apr 16 16:50:34.212712 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.212687 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d23a8c0-9b4c-4789-ba33-747199d2a496-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6d23a8c0-9b4c-4789-ba33-747199d2a496" (UID: "6d23a8c0-9b4c-4789-ba33-747199d2a496"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:50:34.254623 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.254600 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds"
Apr 16 16:50:34.302819 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.302778 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d23a8c0-9b4c-4789-ba33-747199d2a496-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:50:34.375656 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.375601 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds"]
Apr 16 16:50:34.378200 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:50:34.378171 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6353c00e_6b28_4e75_93fa_7e720bedda09.slice/crio-53ca912bf0a4aa1514f94e10812446026d1932fd28e18ab70e3507de676e6c39 WatchSource:0}: Error finding container 53ca912bf0a4aa1514f94e10812446026d1932fd28e18ab70e3507de676e6c39: Status 404 returned error can't find the container with id 53ca912bf0a4aa1514f94e10812446026d1932fd28e18ab70e3507de676e6c39
Apr 16 16:50:34.444726 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.444698 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds" event={"ID":"6353c00e-6b28-4e75-93fa-7e720bedda09","Type":"ContainerStarted","Data":"7f54592c468e6118ac1b3c3096a5fa284f61868aca762855dcddbe801b02fe01"}
Apr 16 16:50:34.444843 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.444737 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds" event={"ID":"6353c00e-6b28-4e75-93fa-7e720bedda09","Type":"ContainerStarted","Data":"53ca912bf0a4aa1514f94e10812446026d1932fd28e18ab70e3507de676e6c39"}
Apr 16 16:50:34.446183 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.446162 2576 generic.go:358] "Generic (PLEG): container finished" podID="6d23a8c0-9b4c-4789-ba33-747199d2a496" containerID="9b945855aa280ef97194c51fbc8c790ac9c76e0fc7b9caebb04261dc78ed4553" exitCode=137
Apr 16 16:50:34.446306 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.446236 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm"
Apr 16 16:50:34.446306 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.446241 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm" event={"ID":"6d23a8c0-9b4c-4789-ba33-747199d2a496","Type":"ContainerDied","Data":"9b945855aa280ef97194c51fbc8c790ac9c76e0fc7b9caebb04261dc78ed4553"}
Apr 16 16:50:34.446306 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.446278 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm" event={"ID":"6d23a8c0-9b4c-4789-ba33-747199d2a496","Type":"ContainerDied","Data":"52d12580544bc112f636a66641e0550f784c5d8988db0d8f562105c473491f52"}
Apr 16 16:50:34.446306 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.446296 2576 scope.go:117] "RemoveContainer" containerID="9b945855aa280ef97194c51fbc8c790ac9c76e0fc7b9caebb04261dc78ed4553"
Apr 16 16:50:34.455067 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.455049 2576 scope.go:117] "RemoveContainer" containerID="b487e6ecfffed92b3f37a09ff377b7677db622fba53947317812b25055e3bad8"
Apr 16 16:50:34.463353 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.463338 2576 scope.go:117] "RemoveContainer" containerID="9b945855aa280ef97194c51fbc8c790ac9c76e0fc7b9caebb04261dc78ed4553"
Apr 16 16:50:34.463603 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:50:34.463582 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b945855aa280ef97194c51fbc8c790ac9c76e0fc7b9caebb04261dc78ed4553\": container with ID starting with 9b945855aa280ef97194c51fbc8c790ac9c76e0fc7b9caebb04261dc78ed4553 not found: ID does not exist" containerID="9b945855aa280ef97194c51fbc8c790ac9c76e0fc7b9caebb04261dc78ed4553"
Apr 16 16:50:34.463669 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.463616 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b945855aa280ef97194c51fbc8c790ac9c76e0fc7b9caebb04261dc78ed4553"} err="failed to get container status \"9b945855aa280ef97194c51fbc8c790ac9c76e0fc7b9caebb04261dc78ed4553\": rpc error: code = NotFound desc = could not find container \"9b945855aa280ef97194c51fbc8c790ac9c76e0fc7b9caebb04261dc78ed4553\": container with ID starting with 9b945855aa280ef97194c51fbc8c790ac9c76e0fc7b9caebb04261dc78ed4553 not found: ID does not exist"
Apr 16 16:50:34.463669 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.463639 2576 scope.go:117] "RemoveContainer" containerID="b487e6ecfffed92b3f37a09ff377b7677db622fba53947317812b25055e3bad8"
Apr 16 16:50:34.463881 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:50:34.463856 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b487e6ecfffed92b3f37a09ff377b7677db622fba53947317812b25055e3bad8\": container with ID starting with b487e6ecfffed92b3f37a09ff377b7677db622fba53947317812b25055e3bad8 not found: ID does not exist" containerID="b487e6ecfffed92b3f37a09ff377b7677db622fba53947317812b25055e3bad8"
Apr 16 16:50:34.463984 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.463884 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b487e6ecfffed92b3f37a09ff377b7677db622fba53947317812b25055e3bad8"} err="failed to get container status \"b487e6ecfffed92b3f37a09ff377b7677db622fba53947317812b25055e3bad8\": rpc error: code = NotFound desc = could not find container \"b487e6ecfffed92b3f37a09ff377b7677db622fba53947317812b25055e3bad8\": container with ID starting with b487e6ecfffed92b3f37a09ff377b7677db622fba53947317812b25055e3bad8 not found: ID does not exist"
Apr 16 16:50:34.473726 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.473704 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm"]
Apr 16 16:50:34.478789 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:34.478767 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-864f6b7649-gcsbm"]
Apr 16 16:50:35.278371 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:35.278337 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d23a8c0-9b4c-4789-ba33-747199d2a496" path="/var/lib/kubelet/pods/6d23a8c0-9b4c-4789-ba33-747199d2a496/volumes"
Apr 16 16:50:38.463875 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:38.463796 2576 generic.go:358] "Generic (PLEG): container finished" podID="6353c00e-6b28-4e75-93fa-7e720bedda09" containerID="7f54592c468e6118ac1b3c3096a5fa284f61868aca762855dcddbe801b02fe01" exitCode=0
Apr 16 16:50:38.464231 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:50:38.463874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds" event={"ID":"6353c00e-6b28-4e75-93fa-7e720bedda09","Type":"ContainerDied","Data":"7f54592c468e6118ac1b3c3096a5fa284f61868aca762855dcddbe801b02fe01"}
Apr 16 16:51:04.559476 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.559453 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq"
Apr 16 16:51:04.583795 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.583763 2576 generic.go:358] "Generic (PLEG): container finished" podID="59c8bd03-5046-4461-9e75-be04f23c7291" containerID="566d64f3e05ba83f0e6eace49e4435be05e3d7cd132d9b9c55a4822a8f41ebb6" exitCode=137
Apr 16 16:51:04.583950 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.583842 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq" event={"ID":"59c8bd03-5046-4461-9e75-be04f23c7291","Type":"ContainerDied","Data":"566d64f3e05ba83f0e6eace49e4435be05e3d7cd132d9b9c55a4822a8f41ebb6"}
Apr 16 16:51:04.583950 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.583872 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq" event={"ID":"59c8bd03-5046-4461-9e75-be04f23c7291","Type":"ContainerDied","Data":"66a6c46aced653773a7646407c13711c5ed863ad12f55ef558c196e6decf651b"}
Apr 16 16:51:04.583950 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.583894 2576 scope.go:117] "RemoveContainer" containerID="566d64f3e05ba83f0e6eace49e4435be05e3d7cd132d9b9c55a4822a8f41ebb6"
Apr 16 16:51:04.583950 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.583848 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq"
Apr 16 16:51:04.595450 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.595427 2576 scope.go:117] "RemoveContainer" containerID="e3f28eca48d7bce4403f9ca2a9c5988a99bfabd28f639a218e85beee63791ab0"
Apr 16 16:51:04.609133 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.609110 2576 scope.go:117] "RemoveContainer" containerID="566d64f3e05ba83f0e6eace49e4435be05e3d7cd132d9b9c55a4822a8f41ebb6"
Apr 16 16:51:04.609643 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:51:04.609616 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"566d64f3e05ba83f0e6eace49e4435be05e3d7cd132d9b9c55a4822a8f41ebb6\": container with ID starting with 566d64f3e05ba83f0e6eace49e4435be05e3d7cd132d9b9c55a4822a8f41ebb6 not found: ID does not exist" containerID="566d64f3e05ba83f0e6eace49e4435be05e3d7cd132d9b9c55a4822a8f41ebb6"
Apr 16 16:51:04.609762 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.609650 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566d64f3e05ba83f0e6eace49e4435be05e3d7cd132d9b9c55a4822a8f41ebb6"} err="failed to get container status \"566d64f3e05ba83f0e6eace49e4435be05e3d7cd132d9b9c55a4822a8f41ebb6\": rpc error: code = NotFound desc = could not find container \"566d64f3e05ba83f0e6eace49e4435be05e3d7cd132d9b9c55a4822a8f41ebb6\": container with ID starting with 566d64f3e05ba83f0e6eace49e4435be05e3d7cd132d9b9c55a4822a8f41ebb6 not found: ID does not exist"
Apr 16 16:51:04.609762 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.609674 2576 scope.go:117] "RemoveContainer" containerID="e3f28eca48d7bce4403f9ca2a9c5988a99bfabd28f639a218e85beee63791ab0"
Apr 16 16:51:04.610136 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:51:04.610050 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e3f28eca48d7bce4403f9ca2a9c5988a99bfabd28f639a218e85beee63791ab0\": container with ID starting with e3f28eca48d7bce4403f9ca2a9c5988a99bfabd28f639a218e85beee63791ab0 not found: ID does not exist" containerID="e3f28eca48d7bce4403f9ca2a9c5988a99bfabd28f639a218e85beee63791ab0" Apr 16 16:51:04.610136 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.610083 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3f28eca48d7bce4403f9ca2a9c5988a99bfabd28f639a218e85beee63791ab0"} err="failed to get container status \"e3f28eca48d7bce4403f9ca2a9c5988a99bfabd28f639a218e85beee63791ab0\": rpc error: code = NotFound desc = could not find container \"e3f28eca48d7bce4403f9ca2a9c5988a99bfabd28f639a218e85beee63791ab0\": container with ID starting with e3f28eca48d7bce4403f9ca2a9c5988a99bfabd28f639a218e85beee63791ab0 not found: ID does not exist" Apr 16 16:51:04.674851 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.674774 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59c8bd03-5046-4461-9e75-be04f23c7291-kserve-provision-location\") pod \"59c8bd03-5046-4461-9e75-be04f23c7291\" (UID: \"59c8bd03-5046-4461-9e75-be04f23c7291\") " Apr 16 16:51:04.679450 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.679419 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59c8bd03-5046-4461-9e75-be04f23c7291-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "59c8bd03-5046-4461-9e75-be04f23c7291" (UID: "59c8bd03-5046-4461-9e75-be04f23c7291"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:51:04.775845 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.775819 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/59c8bd03-5046-4461-9e75-be04f23c7291-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:51:04.910187 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.910155 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq"] Apr 16 16:51:04.913776 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:04.913751 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-7446cd7bb-rcsqq"] Apr 16 16:51:05.279709 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:51:05.279674 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59c8bd03-5046-4461-9e75-be04f23c7291" path="/var/lib/kubelet/pods/59c8bd03-5046-4461-9e75-be04f23c7291/volumes" Apr 16 16:52:32.889371 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:32.889336 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds" event={"ID":"6353c00e-6b28-4e75-93fa-7e720bedda09","Type":"ContainerStarted","Data":"939ddf014903ef2c3c0f684758bf0d15488721bcbb8411ed8ecc14b5ec93e3e9"} Apr 16 16:52:32.889784 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:32.889531 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds" Apr 16 16:52:32.890748 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:32.890726 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds" podUID="6353c00e-6b28-4e75-93fa-7e720bedda09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: 
connect: connection refused" Apr 16 16:52:32.907376 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:32.907333 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds" podStartSLOduration=6.238256317 podStartE2EDuration="1m59.907320535s" podCreationTimestamp="2026-04-16 16:50:33 +0000 UTC" firstStartedPulling="2026-04-16 16:50:38.464874515 +0000 UTC m=+2861.819864622" lastFinishedPulling="2026-04-16 16:52:32.133938729 +0000 UTC m=+2975.488928840" observedRunningTime="2026-04-16 16:52:32.905355319 +0000 UTC m=+2976.260345448" watchObservedRunningTime="2026-04-16 16:52:32.907320535 +0000 UTC m=+2976.262310681" Apr 16 16:52:33.893312 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:33.893279 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds" podUID="6353c00e-6b28-4e75-93fa-7e720bedda09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 16:52:43.894124 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:43.894092 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds" Apr 16 16:52:55.396806 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.396775 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds"] Apr 16 16:52:55.397257 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.397035 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds" podUID="6353c00e-6b28-4e75-93fa-7e720bedda09" containerName="kserve-container" containerID="cri-o://939ddf014903ef2c3c0f684758bf0d15488721bcbb8411ed8ecc14b5ec93e3e9" gracePeriod=30 Apr 16 16:52:55.554730 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.554693 2576 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd"] Apr 16 16:52:55.555001 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.554990 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59c8bd03-5046-4461-9e75-be04f23c7291" containerName="storage-initializer" Apr 16 16:52:55.555046 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.555003 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c8bd03-5046-4461-9e75-be04f23c7291" containerName="storage-initializer" Apr 16 16:52:55.555046 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.555014 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="59c8bd03-5046-4461-9e75-be04f23c7291" containerName="kserve-container" Apr 16 16:52:55.555046 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.555020 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="59c8bd03-5046-4461-9e75-be04f23c7291" containerName="kserve-container" Apr 16 16:52:55.555046 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.555030 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d23a8c0-9b4c-4789-ba33-747199d2a496" containerName="storage-initializer" Apr 16 16:52:55.555046 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.555036 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d23a8c0-9b4c-4789-ba33-747199d2a496" containerName="storage-initializer" Apr 16 16:52:55.555233 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.555049 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d23a8c0-9b4c-4789-ba33-747199d2a496" containerName="kserve-container" Apr 16 16:52:55.555233 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.555055 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d23a8c0-9b4c-4789-ba33-747199d2a496" containerName="kserve-container" Apr 16 16:52:55.555233 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.555106 2576 
memory_manager.go:356] "RemoveStaleState removing state" podUID="6d23a8c0-9b4c-4789-ba33-747199d2a496" containerName="kserve-container" Apr 16 16:52:55.555233 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.555116 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="59c8bd03-5046-4461-9e75-be04f23c7291" containerName="kserve-container" Apr 16 16:52:55.559809 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.559791 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" Apr 16 16:52:55.565429 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.565407 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd"] Apr 16 16:52:55.589266 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.589234 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ac9d3c9-1777-4123-8c54-634d2819b9da-kserve-provision-location\") pod \"isvc-xgboost-predictor-6bd4d9fcc8-drgbd\" (UID: \"3ac9d3c9-1777-4123-8c54-634d2819b9da\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" Apr 16 16:52:55.690534 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.690456 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ac9d3c9-1777-4123-8c54-634d2819b9da-kserve-provision-location\") pod \"isvc-xgboost-predictor-6bd4d9fcc8-drgbd\" (UID: \"3ac9d3c9-1777-4123-8c54-634d2819b9da\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" Apr 16 16:52:55.690827 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.690810 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/3ac9d3c9-1777-4123-8c54-634d2819b9da-kserve-provision-location\") pod \"isvc-xgboost-predictor-6bd4d9fcc8-drgbd\" (UID: \"3ac9d3c9-1777-4123-8c54-634d2819b9da\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" Apr 16 16:52:55.869679 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:55.869653 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" Apr 16 16:52:56.027708 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:56.027672 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd"] Apr 16 16:52:56.031283 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:52:56.031255 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ac9d3c9_1777_4123_8c54_634d2819b9da.slice/crio-e1dd2bf72e8f9179bf83145045d54cc31fb4192fbbe586f9e9b93778634c4f70 WatchSource:0}: Error finding container e1dd2bf72e8f9179bf83145045d54cc31fb4192fbbe586f9e9b93778634c4f70: Status 404 returned error can't find the container with id e1dd2bf72e8f9179bf83145045d54cc31fb4192fbbe586f9e9b93778634c4f70 Apr 16 16:52:56.971887 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:56.971853 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" event={"ID":"3ac9d3c9-1777-4123-8c54-634d2819b9da","Type":"ContainerStarted","Data":"1ac0ad8a16a7dd91cad48367fadb994a7bf8d2ec7a02da071563ffbf2fea0ec0"} Apr 16 16:52:56.971887 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:56.971888 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" event={"ID":"3ac9d3c9-1777-4123-8c54-634d2819b9da","Type":"ContainerStarted","Data":"e1dd2bf72e8f9179bf83145045d54cc31fb4192fbbe586f9e9b93778634c4f70"} Apr 16 16:52:57.666259 ip-10-0-137-150 
kubenswrapper[2576]: I0416 16:52:57.666237 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds" Apr 16 16:52:57.706806 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:57.706736 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6353c00e-6b28-4e75-93fa-7e720bedda09-kserve-provision-location\") pod \"6353c00e-6b28-4e75-93fa-7e720bedda09\" (UID: \"6353c00e-6b28-4e75-93fa-7e720bedda09\") " Apr 16 16:52:57.707120 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:57.707099 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6353c00e-6b28-4e75-93fa-7e720bedda09-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6353c00e-6b28-4e75-93fa-7e720bedda09" (UID: "6353c00e-6b28-4e75-93fa-7e720bedda09"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:52:57.807308 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:57.807279 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6353c00e-6b28-4e75-93fa-7e720bedda09-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:52:57.975733 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:57.975652 2576 generic.go:358] "Generic (PLEG): container finished" podID="6353c00e-6b28-4e75-93fa-7e720bedda09" containerID="939ddf014903ef2c3c0f684758bf0d15488721bcbb8411ed8ecc14b5ec93e3e9" exitCode=0 Apr 16 16:52:57.975733 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:57.975718 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds" Apr 16 16:52:57.975733 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:57.975720 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds" event={"ID":"6353c00e-6b28-4e75-93fa-7e720bedda09","Type":"ContainerDied","Data":"939ddf014903ef2c3c0f684758bf0d15488721bcbb8411ed8ecc14b5ec93e3e9"} Apr 16 16:52:57.976295 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:57.975758 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds" event={"ID":"6353c00e-6b28-4e75-93fa-7e720bedda09","Type":"ContainerDied","Data":"53ca912bf0a4aa1514f94e10812446026d1932fd28e18ab70e3507de676e6c39"} Apr 16 16:52:57.976295 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:57.975774 2576 scope.go:117] "RemoveContainer" containerID="939ddf014903ef2c3c0f684758bf0d15488721bcbb8411ed8ecc14b5ec93e3e9" Apr 16 16:52:57.984270 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:57.984252 2576 scope.go:117] "RemoveContainer" containerID="7f54592c468e6118ac1b3c3096a5fa284f61868aca762855dcddbe801b02fe01" Apr 16 16:52:57.991173 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:57.991155 2576 scope.go:117] "RemoveContainer" containerID="939ddf014903ef2c3c0f684758bf0d15488721bcbb8411ed8ecc14b5ec93e3e9" Apr 16 16:52:57.991428 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:52:57.991408 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"939ddf014903ef2c3c0f684758bf0d15488721bcbb8411ed8ecc14b5ec93e3e9\": container with ID starting with 939ddf014903ef2c3c0f684758bf0d15488721bcbb8411ed8ecc14b5ec93e3e9 not found: ID does not exist" containerID="939ddf014903ef2c3c0f684758bf0d15488721bcbb8411ed8ecc14b5ec93e3e9" Apr 16 16:52:57.991505 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:57.991440 2576 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"939ddf014903ef2c3c0f684758bf0d15488721bcbb8411ed8ecc14b5ec93e3e9"} err="failed to get container status \"939ddf014903ef2c3c0f684758bf0d15488721bcbb8411ed8ecc14b5ec93e3e9\": rpc error: code = NotFound desc = could not find container \"939ddf014903ef2c3c0f684758bf0d15488721bcbb8411ed8ecc14b5ec93e3e9\": container with ID starting with 939ddf014903ef2c3c0f684758bf0d15488721bcbb8411ed8ecc14b5ec93e3e9 not found: ID does not exist" Apr 16 16:52:57.991505 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:57.991464 2576 scope.go:117] "RemoveContainer" containerID="7f54592c468e6118ac1b3c3096a5fa284f61868aca762855dcddbe801b02fe01" Apr 16 16:52:57.991681 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:52:57.991665 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f54592c468e6118ac1b3c3096a5fa284f61868aca762855dcddbe801b02fe01\": container with ID starting with 7f54592c468e6118ac1b3c3096a5fa284f61868aca762855dcddbe801b02fe01 not found: ID does not exist" containerID="7f54592c468e6118ac1b3c3096a5fa284f61868aca762855dcddbe801b02fe01" Apr 16 16:52:57.991726 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:57.991688 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f54592c468e6118ac1b3c3096a5fa284f61868aca762855dcddbe801b02fe01"} err="failed to get container status \"7f54592c468e6118ac1b3c3096a5fa284f61868aca762855dcddbe801b02fe01\": rpc error: code = NotFound desc = could not find container \"7f54592c468e6118ac1b3c3096a5fa284f61868aca762855dcddbe801b02fe01\": container with ID starting with 7f54592c468e6118ac1b3c3096a5fa284f61868aca762855dcddbe801b02fe01 not found: ID does not exist" Apr 16 16:52:57.998807 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:57.998787 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds"] Apr 16 16:52:58.002727 
ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:58.002709 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-5fc768bcf-lc4ds"] Apr 16 16:52:59.277764 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:59.277723 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6353c00e-6b28-4e75-93fa-7e720bedda09" path="/var/lib/kubelet/pods/6353c00e-6b28-4e75-93fa-7e720bedda09/volumes" Apr 16 16:52:59.986181 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:59.986157 2576 generic.go:358] "Generic (PLEG): container finished" podID="3ac9d3c9-1777-4123-8c54-634d2819b9da" containerID="1ac0ad8a16a7dd91cad48367fadb994a7bf8d2ec7a02da071563ffbf2fea0ec0" exitCode=0 Apr 16 16:52:59.986339 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:52:59.986243 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" event={"ID":"3ac9d3c9-1777-4123-8c54-634d2819b9da","Type":"ContainerDied","Data":"1ac0ad8a16a7dd91cad48367fadb994a7bf8d2ec7a02da071563ffbf2fea0ec0"} Apr 16 16:53:20.063934 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:53:20.063903 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" event={"ID":"3ac9d3c9-1777-4123-8c54-634d2819b9da","Type":"ContainerStarted","Data":"2a824421a491cdd308988765165afe2f75e4bdd37f8647d3d231bdb4b3ceea9e"} Apr 16 16:53:20.064390 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:53:20.064201 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" Apr 16 16:53:20.065783 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:53:20.065756 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" podUID="3ac9d3c9-1777-4123-8c54-634d2819b9da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: 
connect: connection refused" Apr 16 16:53:20.080443 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:53:20.080405 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" podStartSLOduration=5.133005251 podStartE2EDuration="25.080393754s" podCreationTimestamp="2026-04-16 16:52:55 +0000 UTC" firstStartedPulling="2026-04-16 16:52:59.987303366 +0000 UTC m=+3003.342293472" lastFinishedPulling="2026-04-16 16:53:19.934691868 +0000 UTC m=+3023.289681975" observedRunningTime="2026-04-16 16:53:20.07865713 +0000 UTC m=+3023.433647260" watchObservedRunningTime="2026-04-16 16:53:20.080393754 +0000 UTC m=+3023.435383883" Apr 16 16:53:21.068904 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:53:21.068869 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" podUID="3ac9d3c9-1777-4123-8c54-634d2819b9da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 16 16:53:31.069734 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:53:31.069692 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" podUID="3ac9d3c9-1777-4123-8c54-634d2819b9da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 16 16:53:41.069050 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:53:41.069005 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" podUID="3ac9d3c9-1777-4123-8c54-634d2819b9da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 16 16:53:51.068924 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:53:51.068841 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" podUID="3ac9d3c9-1777-4123-8c54-634d2819b9da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 16 16:54:01.069484 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:01.069440 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" podUID="3ac9d3c9-1777-4123-8c54-634d2819b9da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 16 16:54:11.069428 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:11.069387 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" podUID="3ac9d3c9-1777-4123-8c54-634d2819b9da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.63:8080: connect: connection refused" Apr 16 16:54:21.070078 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:21.070044 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" Apr 16 16:54:25.595595 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:25.595563 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd"] Apr 16 16:54:25.595996 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:25.595790 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" podUID="3ac9d3c9-1777-4123-8c54-634d2819b9da" containerName="kserve-container" containerID="cri-o://2a824421a491cdd308988765165afe2f75e4bdd37f8647d3d231bdb4b3ceea9e" gracePeriod=30 Apr 16 16:54:25.673638 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:25.673605 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr"] Apr 16 16:54:25.673923 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:25.673910 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6353c00e-6b28-4e75-93fa-7e720bedda09" containerName="kserve-container" Apr 16 16:54:25.673923 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:25.673923 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6353c00e-6b28-4e75-93fa-7e720bedda09" containerName="kserve-container" Apr 16 16:54:25.674010 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:25.673935 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6353c00e-6b28-4e75-93fa-7e720bedda09" containerName="storage-initializer" Apr 16 16:54:25.674010 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:25.673941 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6353c00e-6b28-4e75-93fa-7e720bedda09" containerName="storage-initializer" Apr 16 16:54:25.674010 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:25.673998 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6353c00e-6b28-4e75-93fa-7e720bedda09" containerName="kserve-container" Apr 16 16:54:25.679499 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:25.679472 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr" Apr 16 16:54:25.685622 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:25.685594 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr"] Apr 16 16:54:25.781815 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:25.781785 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr\" (UID: \"f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr" Apr 16 16:54:25.882799 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:25.882718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr\" (UID: \"f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr" Apr 16 16:54:25.883054 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:25.883036 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr\" (UID: \"f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr" Apr 16 16:54:25.991943 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:25.991905 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr" Apr 16 16:54:26.144560 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:26.144533 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr"] Apr 16 16:54:26.147932 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:54:26.147902 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf999475f_ccdd_45b8_b3e0_bcfa2d0a0e3c.slice/crio-4533349b99c088fdfe6d5eac83232a1bbc1bbb863640541329469cc9ce99ff58 WatchSource:0}: Error finding container 4533349b99c088fdfe6d5eac83232a1bbc1bbb863640541329469cc9ce99ff58: Status 404 returned error can't find the container with id 4533349b99c088fdfe6d5eac83232a1bbc1bbb863640541329469cc9ce99ff58 Apr 16 16:54:26.150022 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:26.149999 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:54:26.278337 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:26.278304 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr" event={"ID":"f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c","Type":"ContainerStarted","Data":"1e7ea48cfa1b75ec3328415f839aaa6c31de8ecb0596c98977c8201c6236726c"} Apr 16 16:54:26.278337 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:26.278336 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr" event={"ID":"f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c","Type":"ContainerStarted","Data":"4533349b99c088fdfe6d5eac83232a1bbc1bbb863640541329469cc9ce99ff58"} Apr 16 16:54:29.030585 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.030565 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd"
Apr 16 16:54:29.109289 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.109204 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ac9d3c9-1777-4123-8c54-634d2819b9da-kserve-provision-location\") pod \"3ac9d3c9-1777-4123-8c54-634d2819b9da\" (UID: \"3ac9d3c9-1777-4123-8c54-634d2819b9da\") "
Apr 16 16:54:29.109495 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.109472 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ac9d3c9-1777-4123-8c54-634d2819b9da-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3ac9d3c9-1777-4123-8c54-634d2819b9da" (UID: "3ac9d3c9-1777-4123-8c54-634d2819b9da"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:54:29.210586 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.210560 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ac9d3c9-1777-4123-8c54-634d2819b9da-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:54:29.290467 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.290442 2576 generic.go:358] "Generic (PLEG): container finished" podID="3ac9d3c9-1777-4123-8c54-634d2819b9da" containerID="2a824421a491cdd308988765165afe2f75e4bdd37f8647d3d231bdb4b3ceea9e" exitCode=0
Apr 16 16:54:29.290565 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.290499 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" event={"ID":"3ac9d3c9-1777-4123-8c54-634d2819b9da","Type":"ContainerDied","Data":"2a824421a491cdd308988765165afe2f75e4bdd37f8647d3d231bdb4b3ceea9e"}
Apr 16 16:54:29.290565 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.290502 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd"
Apr 16 16:54:29.290565 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.290521 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd" event={"ID":"3ac9d3c9-1777-4123-8c54-634d2819b9da","Type":"ContainerDied","Data":"e1dd2bf72e8f9179bf83145045d54cc31fb4192fbbe586f9e9b93778634c4f70"}
Apr 16 16:54:29.290565 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.290536 2576 scope.go:117] "RemoveContainer" containerID="2a824421a491cdd308988765165afe2f75e4bdd37f8647d3d231bdb4b3ceea9e"
Apr 16 16:54:29.298086 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.298070 2576 scope.go:117] "RemoveContainer" containerID="1ac0ad8a16a7dd91cad48367fadb994a7bf8d2ec7a02da071563ffbf2fea0ec0"
Apr 16 16:54:29.304877 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.304860 2576 scope.go:117] "RemoveContainer" containerID="2a824421a491cdd308988765165afe2f75e4bdd37f8647d3d231bdb4b3ceea9e"
Apr 16 16:54:29.305192 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:54:29.305160 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a824421a491cdd308988765165afe2f75e4bdd37f8647d3d231bdb4b3ceea9e\": container with ID starting with 2a824421a491cdd308988765165afe2f75e4bdd37f8647d3d231bdb4b3ceea9e not found: ID does not exist" containerID="2a824421a491cdd308988765165afe2f75e4bdd37f8647d3d231bdb4b3ceea9e"
Apr 16 16:54:29.305305 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.305201 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a824421a491cdd308988765165afe2f75e4bdd37f8647d3d231bdb4b3ceea9e"} err="failed to get container status \"2a824421a491cdd308988765165afe2f75e4bdd37f8647d3d231bdb4b3ceea9e\": rpc error: code = NotFound desc = could not find container \"2a824421a491cdd308988765165afe2f75e4bdd37f8647d3d231bdb4b3ceea9e\": container with ID starting with 2a824421a491cdd308988765165afe2f75e4bdd37f8647d3d231bdb4b3ceea9e not found: ID does not exist"
Apr 16 16:54:29.305305 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.305247 2576 scope.go:117] "RemoveContainer" containerID="1ac0ad8a16a7dd91cad48367fadb994a7bf8d2ec7a02da071563ffbf2fea0ec0"
Apr 16 16:54:29.305504 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:54:29.305484 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ac0ad8a16a7dd91cad48367fadb994a7bf8d2ec7a02da071563ffbf2fea0ec0\": container with ID starting with 1ac0ad8a16a7dd91cad48367fadb994a7bf8d2ec7a02da071563ffbf2fea0ec0 not found: ID does not exist" containerID="1ac0ad8a16a7dd91cad48367fadb994a7bf8d2ec7a02da071563ffbf2fea0ec0"
Apr 16 16:54:29.305552 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.305511 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ac0ad8a16a7dd91cad48367fadb994a7bf8d2ec7a02da071563ffbf2fea0ec0"} err="failed to get container status \"1ac0ad8a16a7dd91cad48367fadb994a7bf8d2ec7a02da071563ffbf2fea0ec0\": rpc error: code = NotFound desc = could not find container \"1ac0ad8a16a7dd91cad48367fadb994a7bf8d2ec7a02da071563ffbf2fea0ec0\": container with ID starting with 1ac0ad8a16a7dd91cad48367fadb994a7bf8d2ec7a02da071563ffbf2fea0ec0 not found: ID does not exist"
Apr 16 16:54:29.306712 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.306691 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd"]
Apr 16 16:54:29.309693 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:29.309669 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6bd4d9fcc8-drgbd"]
Apr 16 16:54:30.295612 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:30.295584 2576 generic.go:358] "Generic (PLEG): container finished" podID="f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c" containerID="1e7ea48cfa1b75ec3328415f839aaa6c31de8ecb0596c98977c8201c6236726c" exitCode=0
Apr 16 16:54:30.296017 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:30.295654 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr" event={"ID":"f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c","Type":"ContainerDied","Data":"1e7ea48cfa1b75ec3328415f839aaa6c31de8ecb0596c98977c8201c6236726c"}
Apr 16 16:54:31.282450 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:31.282418 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ac9d3c9-1777-4123-8c54-634d2819b9da" path="/var/lib/kubelet/pods/3ac9d3c9-1777-4123-8c54-634d2819b9da/volumes"
Apr 16 16:54:31.300590 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:31.300561 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr" event={"ID":"f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c","Type":"ContainerStarted","Data":"d9d94979ee2156898726fd6044751275ed04024e4eee58a39db22b7b1bd0e433"}
Apr 16 16:54:31.300947 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:31.300781 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr"
Apr 16 16:54:31.318311 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:54:31.318266 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr" podStartSLOduration=6.3182531730000004 podStartE2EDuration="6.318253173s" podCreationTimestamp="2026-04-16 16:54:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:54:31.316841363 +0000 UTC m=+3094.671831492" watchObservedRunningTime="2026-04-16 16:54:31.318253173 +0000 UTC m=+3094.673243307"
Apr 16 16:55:02.325504 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:02.325475 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr"
Apr 16 16:55:05.865682 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:05.865606 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr"]
Apr 16 16:55:05.866133 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:05.865866 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr" podUID="f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c" containerName="kserve-container" containerID="cri-o://d9d94979ee2156898726fd6044751275ed04024e4eee58a39db22b7b1bd0e433" gracePeriod=30
Apr 16 16:55:05.926472 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:05.926440 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj"]
Apr 16 16:55:05.926775 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:05.926762 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ac9d3c9-1777-4123-8c54-634d2819b9da" containerName="kserve-container"
Apr 16 16:55:05.926825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:05.926777 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac9d3c9-1777-4123-8c54-634d2819b9da" containerName="kserve-container"
Apr 16 16:55:05.926825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:05.926787 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ac9d3c9-1777-4123-8c54-634d2819b9da" containerName="storage-initializer"
Apr 16 16:55:05.926825 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:05.926792 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac9d3c9-1777-4123-8c54-634d2819b9da" containerName="storage-initializer"
Apr 16 16:55:05.926939 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:05.926849 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ac9d3c9-1777-4123-8c54-634d2819b9da" containerName="kserve-container"
Apr 16 16:55:05.930979 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:05.930964 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj"
Apr 16 16:55:05.939352 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:05.939330 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj"]
Apr 16 16:55:06.100761 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:06.100722 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj\" (UID: \"a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj"
Apr 16 16:55:06.201881 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:06.201849 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj\" (UID: \"a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj"
Apr 16 16:55:06.202268 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:06.202245 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj\" (UID: \"a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj"
Apr 16 16:55:06.241588 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:06.241557 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj"
Apr 16 16:55:06.357354 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:06.357303 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj"]
Apr 16 16:55:06.361096 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:55:06.361070 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5a4f4d9_4aad_47c1_8b34_d9eaf08d3e11.slice/crio-b61b0f70ceef1638570d11bfc248162e89155782c3899dcaa7f2446cc38b3764 WatchSource:0}: Error finding container b61b0f70ceef1638570d11bfc248162e89155782c3899dcaa7f2446cc38b3764: Status 404 returned error can't find the container with id b61b0f70ceef1638570d11bfc248162e89155782c3899dcaa7f2446cc38b3764
Apr 16 16:55:06.415566 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:06.415540 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj" event={"ID":"a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11","Type":"ContainerStarted","Data":"b61b0f70ceef1638570d11bfc248162e89155782c3899dcaa7f2446cc38b3764"}
Apr 16 16:55:07.419652 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:07.419616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj" event={"ID":"a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11","Type":"ContainerStarted","Data":"cdb1c5131e897b79a02c55cd4ed80bd9790acfb4ba7481a4ac12aeeccc209964"}
Apr 16 16:55:10.430102 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:10.430066 2576 generic.go:358] "Generic (PLEG): container finished" podID="a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11" containerID="cdb1c5131e897b79a02c55cd4ed80bd9790acfb4ba7481a4ac12aeeccc209964" exitCode=0
Apr 16 16:55:10.430529 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:10.430144 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj" event={"ID":"a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11","Type":"ContainerDied","Data":"cdb1c5131e897b79a02c55cd4ed80bd9790acfb4ba7481a4ac12aeeccc209964"}
Apr 16 16:55:11.435003 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:11.434972 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj" event={"ID":"a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11","Type":"ContainerStarted","Data":"e207a33a90fe479b8476c02f38285d4ed46fc9fffa89444bc2cc3be34c9a0af3"}
Apr 16 16:55:11.435403 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:11.435185 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj"
Apr 16 16:55:11.452795 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:11.452743 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj" podStartSLOduration=6.452726056 podStartE2EDuration="6.452726056s" podCreationTimestamp="2026-04-16 16:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:55:11.451837348 +0000 UTC m=+3134.806827478" watchObservedRunningTime="2026-04-16 16:55:11.452726056 +0000 UTC m=+3134.807716187"
Apr 16 16:55:12.099512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.099481 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr"
Apr 16 16:55:12.250978 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.250891 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c-kserve-provision-location\") pod \"f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c\" (UID: \"f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c\") "
Apr 16 16:55:12.251301 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.251270 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c" (UID: "f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:55:12.351848 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.351821 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:55:12.439967 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.439939 2576 generic.go:358] "Generic (PLEG): container finished" podID="f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c" containerID="d9d94979ee2156898726fd6044751275ed04024e4eee58a39db22b7b1bd0e433" exitCode=0
Apr 16 16:55:12.440361 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.440005 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr"
Apr 16 16:55:12.440361 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.440025 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr" event={"ID":"f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c","Type":"ContainerDied","Data":"d9d94979ee2156898726fd6044751275ed04024e4eee58a39db22b7b1bd0e433"}
Apr 16 16:55:12.440361 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.440057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr" event={"ID":"f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c","Type":"ContainerDied","Data":"4533349b99c088fdfe6d5eac83232a1bbc1bbb863640541329469cc9ce99ff58"}
Apr 16 16:55:12.440361 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.440070 2576 scope.go:117] "RemoveContainer" containerID="d9d94979ee2156898726fd6044751275ed04024e4eee58a39db22b7b1bd0e433"
Apr 16 16:55:12.447621 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.447599 2576 scope.go:117] "RemoveContainer" containerID="1e7ea48cfa1b75ec3328415f839aaa6c31de8ecb0596c98977c8201c6236726c"
Apr 16 16:55:12.454412 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.454386 2576 scope.go:117] "RemoveContainer" containerID="d9d94979ee2156898726fd6044751275ed04024e4eee58a39db22b7b1bd0e433"
Apr 16 16:55:12.454625 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:55:12.454609 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9d94979ee2156898726fd6044751275ed04024e4eee58a39db22b7b1bd0e433\": container with ID starting with d9d94979ee2156898726fd6044751275ed04024e4eee58a39db22b7b1bd0e433 not found: ID does not exist" containerID="d9d94979ee2156898726fd6044751275ed04024e4eee58a39db22b7b1bd0e433"
Apr 16 16:55:12.454672 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.454632 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d94979ee2156898726fd6044751275ed04024e4eee58a39db22b7b1bd0e433"} err="failed to get container status \"d9d94979ee2156898726fd6044751275ed04024e4eee58a39db22b7b1bd0e433\": rpc error: code = NotFound desc = could not find container \"d9d94979ee2156898726fd6044751275ed04024e4eee58a39db22b7b1bd0e433\": container with ID starting with d9d94979ee2156898726fd6044751275ed04024e4eee58a39db22b7b1bd0e433 not found: ID does not exist"
Apr 16 16:55:12.454672 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.454647 2576 scope.go:117] "RemoveContainer" containerID="1e7ea48cfa1b75ec3328415f839aaa6c31de8ecb0596c98977c8201c6236726c"
Apr 16 16:55:12.454853 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:55:12.454834 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e7ea48cfa1b75ec3328415f839aaa6c31de8ecb0596c98977c8201c6236726c\": container with ID starting with 1e7ea48cfa1b75ec3328415f839aaa6c31de8ecb0596c98977c8201c6236726c not found: ID does not exist" containerID="1e7ea48cfa1b75ec3328415f839aaa6c31de8ecb0596c98977c8201c6236726c"
Apr 16 16:55:12.454890 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.454860 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e7ea48cfa1b75ec3328415f839aaa6c31de8ecb0596c98977c8201c6236726c"} err="failed to get container status \"1e7ea48cfa1b75ec3328415f839aaa6c31de8ecb0596c98977c8201c6236726c\": rpc error: code = NotFound desc = could not find container \"1e7ea48cfa1b75ec3328415f839aaa6c31de8ecb0596c98977c8201c6236726c\": container with ID starting with 1e7ea48cfa1b75ec3328415f839aaa6c31de8ecb0596c98977c8201c6236726c not found: ID does not exist"
Apr 16 16:55:12.460590 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.460570 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr"]
Apr 16 16:55:12.463998 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:12.463981 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-9c799c49c-xbxpr"]
Apr 16 16:55:13.278525 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:13.278494 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c" path="/var/lib/kubelet/pods/f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c/volumes"
Apr 16 16:55:42.526015 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:42.525982 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj"
Apr 16 16:55:46.031309 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.031193 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj"]
Apr 16 16:55:46.031760 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.031437 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj" podUID="a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11" containerName="kserve-container" containerID="cri-o://e207a33a90fe479b8476c02f38285d4ed46fc9fffa89444bc2cc3be34c9a0af3" gracePeriod=30
Apr 16 16:55:46.091375 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.091343 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj"]
Apr 16 16:55:46.091685 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.091674 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c" containerName="storage-initializer"
Apr 16 16:55:46.091727 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.091687 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c" containerName="storage-initializer"
Apr 16 16:55:46.091727 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.091703 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c" containerName="kserve-container"
Apr 16 16:55:46.091727 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.091709 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c" containerName="kserve-container"
Apr 16 16:55:46.091824 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.091756 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f999475f-ccdd-45b8-b3e0-bcfa2d0a0e3c" containerName="kserve-container"
Apr 16 16:55:46.094618 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.094600 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj"
Apr 16 16:55:46.102778 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.102755 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj"]
Apr 16 16:55:46.210321 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.210280 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb24c9b3-436e-4d49-8ec6-238e4c9f9733-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj\" (UID: \"fb24c9b3-436e-4d49-8ec6-238e4c9f9733\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj"
Apr 16 16:55:46.310982 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.310901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb24c9b3-436e-4d49-8ec6-238e4c9f9733-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj\" (UID: \"fb24c9b3-436e-4d49-8ec6-238e4c9f9733\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj"
Apr 16 16:55:46.311349 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.311331 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb24c9b3-436e-4d49-8ec6-238e4c9f9733-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj\" (UID: \"fb24c9b3-436e-4d49-8ec6-238e4c9f9733\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj"
Apr 16 16:55:46.405200 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.405167 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj"
Apr 16 16:55:46.531772 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.531743 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj"]
Apr 16 16:55:46.535422 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:55:46.535392 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb24c9b3_436e_4d49_8ec6_238e4c9f9733.slice/crio-f3c1da729980bd3d6d49d10ffd3370ef44aca905004579f97633aff9fd0a63f0 WatchSource:0}: Error finding container f3c1da729980bd3d6d49d10ffd3370ef44aca905004579f97633aff9fd0a63f0: Status 404 returned error can't find the container with id f3c1da729980bd3d6d49d10ffd3370ef44aca905004579f97633aff9fd0a63f0
Apr 16 16:55:46.552053 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:46.552022 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" event={"ID":"fb24c9b3-436e-4d49-8ec6-238e4c9f9733","Type":"ContainerStarted","Data":"f3c1da729980bd3d6d49d10ffd3370ef44aca905004579f97633aff9fd0a63f0"}
Apr 16 16:55:47.557502 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:47.557464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" event={"ID":"fb24c9b3-436e-4d49-8ec6-238e4c9f9733","Type":"ContainerStarted","Data":"128598a5902b1de85c85aa9db418e830fdfc072e0fd30fe2f27c597208674e11"}
Apr 16 16:55:50.568137 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:50.568059 2576 generic.go:358] "Generic (PLEG): container finished" podID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" containerID="128598a5902b1de85c85aa9db418e830fdfc072e0fd30fe2f27c597208674e11" exitCode=0
Apr 16 16:55:50.568137 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:50.568110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" event={"ID":"fb24c9b3-436e-4d49-8ec6-238e4c9f9733","Type":"ContainerDied","Data":"128598a5902b1de85c85aa9db418e830fdfc072e0fd30fe2f27c597208674e11"}
Apr 16 16:55:51.573065 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:51.573032 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" event={"ID":"fb24c9b3-436e-4d49-8ec6-238e4c9f9733","Type":"ContainerStarted","Data":"59ba6dc5ad35e4446ee5d8419ec8040f82ea4f07ad2541c5ac0ec9bbc149a648"}
Apr 16 16:55:51.573481 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:51.573328 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj"
Apr 16 16:55:51.574658 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:51.574633 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" podUID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused"
Apr 16 16:55:51.594307 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:51.594256 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" podStartSLOduration=5.594242133 podStartE2EDuration="5.594242133s" podCreationTimestamp="2026-04-16 16:55:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:55:51.592152228 +0000 UTC m=+3174.947142356" watchObservedRunningTime="2026-04-16 16:55:51.594242133 +0000 UTC m=+3174.949232284"
Apr 16 16:55:52.440630 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:52.440593 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj" podUID="a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.65:8080/v2/models/xgboost-v2-mlserver/ready\": dial tcp 10.133.0.65:8080: connect: connection refused"
Apr 16 16:55:52.581094 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:52.581040 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" podUID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused"
Apr 16 16:55:55.587780 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:55.587750 2576 generic.go:358] "Generic (PLEG): container finished" podID="a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11" containerID="e207a33a90fe479b8476c02f38285d4ed46fc9fffa89444bc2cc3be34c9a0af3" exitCode=0
Apr 16 16:55:55.588141 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:55.587812 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj" event={"ID":"a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11","Type":"ContainerDied","Data":"e207a33a90fe479b8476c02f38285d4ed46fc9fffa89444bc2cc3be34c9a0af3"}
Apr 16 16:55:55.588141 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:55.587836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj" event={"ID":"a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11","Type":"ContainerDied","Data":"b61b0f70ceef1638570d11bfc248162e89155782c3899dcaa7f2446cc38b3764"}
Apr 16 16:55:55.588141 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:55.587846 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b61b0f70ceef1638570d11bfc248162e89155782c3899dcaa7f2446cc38b3764"
Apr 16 16:55:55.590617 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:55.590598 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj"
Apr 16 16:55:55.682404 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:55.682299 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11-kserve-provision-location\") pod \"a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11\" (UID: \"a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11\") "
Apr 16 16:55:55.682661 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:55.682635 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11" (UID: "a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:55:55.783353 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:55.783316 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 16:55:56.590512 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:56.590478 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj"
Apr 16 16:55:56.614649 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:56.614619 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj"]
Apr 16 16:55:56.617701 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:56.617674 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-654cf7d7c6-k44rj"]
Apr 16 16:55:57.278314 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:55:57.278268 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11" path="/var/lib/kubelet/pods/a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11/volumes"
Apr 16 16:56:02.578785 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:02.578743 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" podUID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused"
Apr 16 16:56:12.579044 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:12.578998 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" podUID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused"
Apr 16 16:56:22.578728 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:22.578681 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" podUID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused"
Apr 16 16:56:32.578628 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:32.578583 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" podUID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused"
Apr 16 16:56:42.579161 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:42.579112 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" podUID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused"
Apr 16 16:56:52.579517 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:52.579431 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj"
Apr 16 16:56:56.259841 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:56.259801 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj"]
Apr 16 16:56:56.260325 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:56.260230 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" podUID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" containerName="kserve-container" containerID="cri-o://59ba6dc5ad35e4446ee5d8419ec8040f82ea4f07ad2541c5ac0ec9bbc149a648" gracePeriod=30
Apr 16 16:56:56.337394 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:56.337357 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv"]
Apr 16 16:56:56.337667 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:56.337655 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11" containerName="kserve-container"
Apr 16 16:56:56.337710 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:56.337669 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11" containerName="kserve-container"
Apr 16 16:56:56.337710 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:56.337686 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11" containerName="storage-initializer"
Apr 16 16:56:56.337710 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:56.337692 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11" containerName="storage-initializer"
Apr 16 16:56:56.337812 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:56.337746 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a5a4f4d9-4aad-47c1-8b34-d9eaf08d3e11" containerName="kserve-container"
Apr 16 16:56:56.340801 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:56.340784 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" Apr 16 16:56:56.351145 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:56.351125 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eef9df76-2fc0-4daf-96e5-dcd8390d6e12-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv\" (UID: \"eef9df76-2fc0-4daf-96e5-dcd8390d6e12\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" Apr 16 16:56:56.411455 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:56.411421 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv"] Apr 16 16:56:56.452288 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:56.452259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eef9df76-2fc0-4daf-96e5-dcd8390d6e12-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv\" (UID: \"eef9df76-2fc0-4daf-96e5-dcd8390d6e12\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" Apr 16 16:56:56.452628 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:56.452612 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eef9df76-2fc0-4daf-96e5-dcd8390d6e12-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv\" (UID: \"eef9df76-2fc0-4daf-96e5-dcd8390d6e12\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" Apr 16 16:56:56.650447 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:56.650414 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" Apr 16 16:56:56.779193 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:56.779165 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv"] Apr 16 16:56:56.781922 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:56:56.781884 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeef9df76_2fc0_4daf_96e5_dcd8390d6e12.slice/crio-c635216a9ac0322fd9d9ed41b9d10bc88d51276d61032c29791ba700103f8ad9 WatchSource:0}: Error finding container c635216a9ac0322fd9d9ed41b9d10bc88d51276d61032c29791ba700103f8ad9: Status 404 returned error can't find the container with id c635216a9ac0322fd9d9ed41b9d10bc88d51276d61032c29791ba700103f8ad9 Apr 16 16:56:57.787882 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:57.787849 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" event={"ID":"eef9df76-2fc0-4daf-96e5-dcd8390d6e12","Type":"ContainerStarted","Data":"cd91fdd102095745340fdbcfa968e5bf60cdd20d8b8d24fbd4123d5c0e4bbc3d"} Apr 16 16:56:57.787882 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:56:57.787884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" event={"ID":"eef9df76-2fc0-4daf-96e5-dcd8390d6e12","Type":"ContainerStarted","Data":"c635216a9ac0322fd9d9ed41b9d10bc88d51276d61032c29791ba700103f8ad9"} Apr 16 16:57:00.003445 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.003420 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" Apr 16 16:57:00.078995 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.078887 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb24c9b3-436e-4d49-8ec6-238e4c9f9733-kserve-provision-location\") pod \"fb24c9b3-436e-4d49-8ec6-238e4c9f9733\" (UID: \"fb24c9b3-436e-4d49-8ec6-238e4c9f9733\") " Apr 16 16:57:00.079290 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.079268 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb24c9b3-436e-4d49-8ec6-238e4c9f9733-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fb24c9b3-436e-4d49-8ec6-238e4c9f9733" (UID: "fb24c9b3-436e-4d49-8ec6-238e4c9f9733"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:57:00.180041 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.179991 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb24c9b3-436e-4d49-8ec6-238e4c9f9733-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:57:00.798873 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.798787 2576 generic.go:358] "Generic (PLEG): container finished" podID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" containerID="59ba6dc5ad35e4446ee5d8419ec8040f82ea4f07ad2541c5ac0ec9bbc149a648" exitCode=0 Apr 16 16:57:00.798873 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.798857 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" Apr 16 16:57:00.799106 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.798871 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" event={"ID":"fb24c9b3-436e-4d49-8ec6-238e4c9f9733","Type":"ContainerDied","Data":"59ba6dc5ad35e4446ee5d8419ec8040f82ea4f07ad2541c5ac0ec9bbc149a648"} Apr 16 16:57:00.799106 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.798906 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj" event={"ID":"fb24c9b3-436e-4d49-8ec6-238e4c9f9733","Type":"ContainerDied","Data":"f3c1da729980bd3d6d49d10ffd3370ef44aca905004579f97633aff9fd0a63f0"} Apr 16 16:57:00.799106 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.798928 2576 scope.go:117] "RemoveContainer" containerID="59ba6dc5ad35e4446ee5d8419ec8040f82ea4f07ad2541c5ac0ec9bbc149a648" Apr 16 16:57:00.800412 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.800374 2576 generic.go:358] "Generic (PLEG): container finished" podID="eef9df76-2fc0-4daf-96e5-dcd8390d6e12" containerID="cd91fdd102095745340fdbcfa968e5bf60cdd20d8b8d24fbd4123d5c0e4bbc3d" exitCode=0 Apr 16 16:57:00.800524 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.800428 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" event={"ID":"eef9df76-2fc0-4daf-96e5-dcd8390d6e12","Type":"ContainerDied","Data":"cd91fdd102095745340fdbcfa968e5bf60cdd20d8b8d24fbd4123d5c0e4bbc3d"} Apr 16 16:57:00.807982 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.807824 2576 scope.go:117] "RemoveContainer" containerID="128598a5902b1de85c85aa9db418e830fdfc072e0fd30fe2f27c597208674e11" Apr 16 16:57:00.814667 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.814647 2576 scope.go:117] "RemoveContainer" 
containerID="59ba6dc5ad35e4446ee5d8419ec8040f82ea4f07ad2541c5ac0ec9bbc149a648" Apr 16 16:57:00.814917 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:57:00.814896 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59ba6dc5ad35e4446ee5d8419ec8040f82ea4f07ad2541c5ac0ec9bbc149a648\": container with ID starting with 59ba6dc5ad35e4446ee5d8419ec8040f82ea4f07ad2541c5ac0ec9bbc149a648 not found: ID does not exist" containerID="59ba6dc5ad35e4446ee5d8419ec8040f82ea4f07ad2541c5ac0ec9bbc149a648" Apr 16 16:57:00.814966 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.814928 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59ba6dc5ad35e4446ee5d8419ec8040f82ea4f07ad2541c5ac0ec9bbc149a648"} err="failed to get container status \"59ba6dc5ad35e4446ee5d8419ec8040f82ea4f07ad2541c5ac0ec9bbc149a648\": rpc error: code = NotFound desc = could not find container \"59ba6dc5ad35e4446ee5d8419ec8040f82ea4f07ad2541c5ac0ec9bbc149a648\": container with ID starting with 59ba6dc5ad35e4446ee5d8419ec8040f82ea4f07ad2541c5ac0ec9bbc149a648 not found: ID does not exist" Apr 16 16:57:00.814966 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.814949 2576 scope.go:117] "RemoveContainer" containerID="128598a5902b1de85c85aa9db418e830fdfc072e0fd30fe2f27c597208674e11" Apr 16 16:57:00.815196 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:57:00.815175 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"128598a5902b1de85c85aa9db418e830fdfc072e0fd30fe2f27c597208674e11\": container with ID starting with 128598a5902b1de85c85aa9db418e830fdfc072e0fd30fe2f27c597208674e11 not found: ID does not exist" containerID="128598a5902b1de85c85aa9db418e830fdfc072e0fd30fe2f27c597208674e11" Apr 16 16:57:00.815275 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.815202 2576 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"128598a5902b1de85c85aa9db418e830fdfc072e0fd30fe2f27c597208674e11"} err="failed to get container status \"128598a5902b1de85c85aa9db418e830fdfc072e0fd30fe2f27c597208674e11\": rpc error: code = NotFound desc = could not find container \"128598a5902b1de85c85aa9db418e830fdfc072e0fd30fe2f27c597208674e11\": container with ID starting with 128598a5902b1de85c85aa9db418e830fdfc072e0fd30fe2f27c597208674e11 not found: ID does not exist" Apr 16 16:57:00.834955 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.834929 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj"] Apr 16 16:57:00.838074 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:00.838050 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-5947cc4c99-7ngrj"] Apr 16 16:57:01.278110 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:01.278075 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" path="/var/lib/kubelet/pods/fb24c9b3-436e-4d49-8ec6-238e4c9f9733/volumes" Apr 16 16:57:01.805291 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:01.805255 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" event={"ID":"eef9df76-2fc0-4daf-96e5-dcd8390d6e12","Type":"ContainerStarted","Data":"f42755cb2f36788783e95b46e6cef86d98716a0ea60d9815bc556076ee5531b2"} Apr 16 16:57:01.805515 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:01.805495 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" Apr 16 16:57:01.826545 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:01.826495 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" 
podStartSLOduration=5.82648136 podStartE2EDuration="5.82648136s" podCreationTimestamp="2026-04-16 16:56:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:57:01.825068121 +0000 UTC m=+3245.180058272" watchObservedRunningTime="2026-04-16 16:57:01.82648136 +0000 UTC m=+3245.181471491" Apr 16 16:57:32.825164 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:32.825104 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" podUID="eef9df76-2fc0-4daf-96e5-dcd8390d6e12" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 16:57:42.811781 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:42.811753 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" Apr 16 16:57:46.362562 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.362533 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv"] Apr 16 16:57:46.362988 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.362842 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" podUID="eef9df76-2fc0-4daf-96e5-dcd8390d6e12" containerName="kserve-container" containerID="cri-o://f42755cb2f36788783e95b46e6cef86d98716a0ea60d9815bc556076ee5531b2" gracePeriod=30 Apr 16 16:57:46.416076 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.416045 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv"] Apr 16 16:57:46.416425 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.416404 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" containerName="storage-initializer" Apr 16 16:57:46.416425 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.416426 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" containerName="storage-initializer" Apr 16 16:57:46.416521 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.416436 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" containerName="kserve-container" Apr 16 16:57:46.416521 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.416442 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" containerName="kserve-container" Apr 16 16:57:46.416521 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.416488 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb24c9b3-436e-4d49-8ec6-238e4c9f9733" containerName="kserve-container" Apr 16 16:57:46.419379 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.419362 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" Apr 16 16:57:46.427732 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.427710 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv"] Apr 16 16:57:46.538304 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.538267 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88f2cb88-fd7c-4867-8d75-328d9a4a5b92-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-675d9b5ff-5b9rv\" (UID: \"88f2cb88-fd7c-4867-8d75-328d9a4a5b92\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" Apr 16 16:57:46.638980 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.638894 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88f2cb88-fd7c-4867-8d75-328d9a4a5b92-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-675d9b5ff-5b9rv\" (UID: \"88f2cb88-fd7c-4867-8d75-328d9a4a5b92\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" Apr 16 16:57:46.639303 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.639283 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88f2cb88-fd7c-4867-8d75-328d9a4a5b92-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-675d9b5ff-5b9rv\" (UID: \"88f2cb88-fd7c-4867-8d75-328d9a4a5b92\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" Apr 16 16:57:46.730406 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.730377 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" Apr 16 16:57:46.875992 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.875968 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv"] Apr 16 16:57:46.878183 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:57:46.878152 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88f2cb88_fd7c_4867_8d75_328d9a4a5b92.slice/crio-cbce7e0c6302c3c019e3549925bff8fd67d581b90f94b23b2b5a2ea472eb479e WatchSource:0}: Error finding container cbce7e0c6302c3c019e3549925bff8fd67d581b90f94b23b2b5a2ea472eb479e: Status 404 returned error can't find the container with id cbce7e0c6302c3c019e3549925bff8fd67d581b90f94b23b2b5a2ea472eb479e Apr 16 16:57:46.951706 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.951678 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" event={"ID":"88f2cb88-fd7c-4867-8d75-328d9a4a5b92","Type":"ContainerStarted","Data":"60a04ebabab81d22f8bda7905fffdc8bfb8b340f7fdf004816ffcd2793f4a680"} Apr 16 16:57:46.951706 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:46.951711 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" event={"ID":"88f2cb88-fd7c-4867-8d75-328d9a4a5b92","Type":"ContainerStarted","Data":"cbce7e0c6302c3c019e3549925bff8fd67d581b90f94b23b2b5a2ea472eb479e"} Apr 16 16:57:50.967108 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:50.967077 2576 generic.go:358] "Generic (PLEG): container finished" podID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" containerID="60a04ebabab81d22f8bda7905fffdc8bfb8b340f7fdf004816ffcd2793f4a680" exitCode=0 Apr 16 16:57:50.967579 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:50.967116 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" event={"ID":"88f2cb88-fd7c-4867-8d75-328d9a4a5b92","Type":"ContainerDied","Data":"60a04ebabab81d22f8bda7905fffdc8bfb8b340f7fdf004816ffcd2793f4a680"} Apr 16 16:57:51.972399 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:51.972364 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" event={"ID":"88f2cb88-fd7c-4867-8d75-328d9a4a5b92","Type":"ContainerStarted","Data":"6d740be3aa12cbef0682deaf5b7c0abf2e1ef0b8b6179478e6dbc8a5d3b324e9"} Apr 16 16:57:51.972822 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:51.972693 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" Apr 16 16:57:51.973753 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:51.973729 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" podUID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 16 16:57:51.988443 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:51.988402 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" podStartSLOduration=5.988390132 podStartE2EDuration="5.988390132s" podCreationTimestamp="2026-04-16 16:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:57:51.986989429 +0000 UTC m=+3295.341979558" watchObservedRunningTime="2026-04-16 16:57:51.988390132 +0000 UTC m=+3295.343380254" Apr 16 16:57:52.810308 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:52.810265 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" 
podUID="eef9df76-2fc0-4daf-96e5-dcd8390d6e12" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.67:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.133.0.67:8080: connect: connection refused" Apr 16 16:57:52.976649 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:52.976610 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" podUID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 16 16:57:53.704460 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:53.704433 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" Apr 16 16:57:53.795587 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:53.795518 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eef9df76-2fc0-4daf-96e5-dcd8390d6e12-kserve-provision-location\") pod \"eef9df76-2fc0-4daf-96e5-dcd8390d6e12\" (UID: \"eef9df76-2fc0-4daf-96e5-dcd8390d6e12\") " Apr 16 16:57:53.795815 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:53.795793 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eef9df76-2fc0-4daf-96e5-dcd8390d6e12-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "eef9df76-2fc0-4daf-96e5-dcd8390d6e12" (UID: "eef9df76-2fc0-4daf-96e5-dcd8390d6e12"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:57:53.896553 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:53.896521 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eef9df76-2fc0-4daf-96e5-dcd8390d6e12-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:57:53.980432 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:53.980396 2576 generic.go:358] "Generic (PLEG): container finished" podID="eef9df76-2fc0-4daf-96e5-dcd8390d6e12" containerID="f42755cb2f36788783e95b46e6cef86d98716a0ea60d9815bc556076ee5531b2" exitCode=0 Apr 16 16:57:53.980802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:53.980474 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" Apr 16 16:57:53.980802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:53.980484 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" event={"ID":"eef9df76-2fc0-4daf-96e5-dcd8390d6e12","Type":"ContainerDied","Data":"f42755cb2f36788783e95b46e6cef86d98716a0ea60d9815bc556076ee5531b2"} Apr 16 16:57:53.980802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:53.980520 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv" event={"ID":"eef9df76-2fc0-4daf-96e5-dcd8390d6e12","Type":"ContainerDied","Data":"c635216a9ac0322fd9d9ed41b9d10bc88d51276d61032c29791ba700103f8ad9"} Apr 16 16:57:53.980802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:53.980535 2576 scope.go:117] "RemoveContainer" containerID="f42755cb2f36788783e95b46e6cef86d98716a0ea60d9815bc556076ee5531b2" Apr 16 16:57:53.989352 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:53.989335 2576 scope.go:117] "RemoveContainer" 
containerID="cd91fdd102095745340fdbcfa968e5bf60cdd20d8b8d24fbd4123d5c0e4bbc3d" Apr 16 16:57:53.995994 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:53.995981 2576 scope.go:117] "RemoveContainer" containerID="f42755cb2f36788783e95b46e6cef86d98716a0ea60d9815bc556076ee5531b2" Apr 16 16:57:53.996204 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:57:53.996189 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f42755cb2f36788783e95b46e6cef86d98716a0ea60d9815bc556076ee5531b2\": container with ID starting with f42755cb2f36788783e95b46e6cef86d98716a0ea60d9815bc556076ee5531b2 not found: ID does not exist" containerID="f42755cb2f36788783e95b46e6cef86d98716a0ea60d9815bc556076ee5531b2" Apr 16 16:57:53.996334 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:53.996227 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f42755cb2f36788783e95b46e6cef86d98716a0ea60d9815bc556076ee5531b2"} err="failed to get container status \"f42755cb2f36788783e95b46e6cef86d98716a0ea60d9815bc556076ee5531b2\": rpc error: code = NotFound desc = could not find container \"f42755cb2f36788783e95b46e6cef86d98716a0ea60d9815bc556076ee5531b2\": container with ID starting with f42755cb2f36788783e95b46e6cef86d98716a0ea60d9815bc556076ee5531b2 not found: ID does not exist" Apr 16 16:57:53.996334 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:53.996253 2576 scope.go:117] "RemoveContainer" containerID="cd91fdd102095745340fdbcfa968e5bf60cdd20d8b8d24fbd4123d5c0e4bbc3d" Apr 16 16:57:53.996490 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:57:53.996474 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd91fdd102095745340fdbcfa968e5bf60cdd20d8b8d24fbd4123d5c0e4bbc3d\": container with ID starting with cd91fdd102095745340fdbcfa968e5bf60cdd20d8b8d24fbd4123d5c0e4bbc3d not found: ID does not exist" 
containerID="cd91fdd102095745340fdbcfa968e5bf60cdd20d8b8d24fbd4123d5c0e4bbc3d" Apr 16 16:57:53.996535 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:53.996494 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd91fdd102095745340fdbcfa968e5bf60cdd20d8b8d24fbd4123d5c0e4bbc3d"} err="failed to get container status \"cd91fdd102095745340fdbcfa968e5bf60cdd20d8b8d24fbd4123d5c0e4bbc3d\": rpc error: code = NotFound desc = could not find container \"cd91fdd102095745340fdbcfa968e5bf60cdd20d8b8d24fbd4123d5c0e4bbc3d\": container with ID starting with cd91fdd102095745340fdbcfa968e5bf60cdd20d8b8d24fbd4123d5c0e4bbc3d not found: ID does not exist" Apr 16 16:57:54.002466 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:54.002433 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv"] Apr 16 16:57:54.004008 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:54.003989 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-d58974886-6h8lv"] Apr 16 16:57:55.277910 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:57:55.277878 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef9df76-2fc0-4daf-96e5-dcd8390d6e12" path="/var/lib/kubelet/pods/eef9df76-2fc0-4daf-96e5-dcd8390d6e12/volumes" Apr 16 16:58:02.977428 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:02.977386 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" podUID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 16 16:58:12.977165 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:12.977124 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" 
podUID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 16 16:58:22.977199 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:22.977105 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" podUID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 16 16:58:32.977398 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:32.977356 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" podUID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 16 16:58:42.977280 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:42.977236 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" podUID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused" Apr 16 16:58:52.977203 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:52.977158 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" Apr 16 16:58:56.558743 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:56.558710 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv"] Apr 16 16:58:56.559137 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:56.558985 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" podUID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" 
containerName="kserve-container" containerID="cri-o://6d740be3aa12cbef0682deaf5b7c0abf2e1ef0b8b6179478e6dbc8a5d3b324e9" gracePeriod=30 Apr 16 16:58:56.642494 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:56.642463 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8"] Apr 16 16:58:56.642782 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:56.642771 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eef9df76-2fc0-4daf-96e5-dcd8390d6e12" containerName="storage-initializer" Apr 16 16:58:56.642832 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:56.642785 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef9df76-2fc0-4daf-96e5-dcd8390d6e12" containerName="storage-initializer" Apr 16 16:58:56.642832 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:56.642794 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eef9df76-2fc0-4daf-96e5-dcd8390d6e12" containerName="kserve-container" Apr 16 16:58:56.642832 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:56.642800 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef9df76-2fc0-4daf-96e5-dcd8390d6e12" containerName="kserve-container" Apr 16 16:58:56.642926 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:56.642863 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="eef9df76-2fc0-4daf-96e5-dcd8390d6e12" containerName="kserve-container" Apr 16 16:58:56.645852 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:56.645834 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" Apr 16 16:58:56.649598 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:56.649578 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 16:58:56.661623 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:56.661601 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8"] Apr 16 16:58:56.754461 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:56.754424 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c63aba9a-0bec-4778-8115-a8cb69433080-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8\" (UID: \"c63aba9a-0bec-4778-8115-a8cb69433080\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" Apr 16 16:58:56.855663 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:56.855582 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c63aba9a-0bec-4778-8115-a8cb69433080-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8\" (UID: \"c63aba9a-0bec-4778-8115-a8cb69433080\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" Apr 16 16:58:56.855959 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:56.855941 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c63aba9a-0bec-4778-8115-a8cb69433080-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8\" (UID: \"c63aba9a-0bec-4778-8115-a8cb69433080\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" Apr 16 16:58:56.956195 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:56.956167 
2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" Apr 16 16:58:57.073707 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:57.073670 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8"] Apr 16 16:58:57.076340 ip-10-0-137-150 kubenswrapper[2576]: W0416 16:58:57.076310 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc63aba9a_0bec_4778_8115_a8cb69433080.slice/crio-b52c6657f045483e0ce436a8ded8fadddd3340f6b4ff0fcdfa30baaa01f7684d WatchSource:0}: Error finding container b52c6657f045483e0ce436a8ded8fadddd3340f6b4ff0fcdfa30baaa01f7684d: Status 404 returned error can't find the container with id b52c6657f045483e0ce436a8ded8fadddd3340f6b4ff0fcdfa30baaa01f7684d Apr 16 16:58:57.188670 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:57.188639 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" event={"ID":"c63aba9a-0bec-4778-8115-a8cb69433080","Type":"ContainerStarted","Data":"95cb2f0bfc853ca1eaad10b5f3708c0f4358d27572b7fd42f0312afb1063d979"} Apr 16 16:58:57.188802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:57.188676 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" event={"ID":"c63aba9a-0bec-4778-8115-a8cb69433080","Type":"ContainerStarted","Data":"b52c6657f045483e0ce436a8ded8fadddd3340f6b4ff0fcdfa30baaa01f7684d"} Apr 16 16:58:58.192953 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:58.192917 2576 generic.go:358] "Generic (PLEG): container finished" podID="c63aba9a-0bec-4778-8115-a8cb69433080" containerID="95cb2f0bfc853ca1eaad10b5f3708c0f4358d27572b7fd42f0312afb1063d979" exitCode=0 Apr 16 16:58:58.193432 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:58.193007 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" event={"ID":"c63aba9a-0bec-4778-8115-a8cb69433080","Type":"ContainerDied","Data":"95cb2f0bfc853ca1eaad10b5f3708c0f4358d27572b7fd42f0312afb1063d979"} Apr 16 16:58:59.198158 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:59.198125 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" event={"ID":"c63aba9a-0bec-4778-8115-a8cb69433080","Type":"ContainerStarted","Data":"183a702d2e94c228fadcd541f916c626703ca729378680396046986b4c936cf0"} Apr 16 16:58:59.198559 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:59.198358 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" Apr 16 16:58:59.199296 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:59.199274 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" podUID="c63aba9a-0bec-4778-8115-a8cb69433080" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 16:58:59.216366 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:59.216321 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" podStartSLOduration=3.216306654 podStartE2EDuration="3.216306654s" podCreationTimestamp="2026-04-16 16:58:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:58:59.215472538 +0000 UTC m=+3362.570462670" watchObservedRunningTime="2026-04-16 16:58:59.216306654 +0000 UTC m=+3362.571296783" Apr 16 16:58:59.998749 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:58:59.998728 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" Apr 16 16:59:00.080088 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.080014 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88f2cb88-fd7c-4867-8d75-328d9a4a5b92-kserve-provision-location\") pod \"88f2cb88-fd7c-4867-8d75-328d9a4a5b92\" (UID: \"88f2cb88-fd7c-4867-8d75-328d9a4a5b92\") " Apr 16 16:59:00.080295 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.080274 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88f2cb88-fd7c-4867-8d75-328d9a4a5b92-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "88f2cb88-fd7c-4867-8d75-328d9a4a5b92" (UID: "88f2cb88-fd7c-4867-8d75-328d9a4a5b92"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:59:00.181341 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.181314 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/88f2cb88-fd7c-4867-8d75-328d9a4a5b92-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 16:59:00.204750 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.204722 2576 generic.go:358] "Generic (PLEG): container finished" podID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" containerID="6d740be3aa12cbef0682deaf5b7c0abf2e1ef0b8b6179478e6dbc8a5d3b324e9" exitCode=0 Apr 16 16:59:00.205129 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.204788 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" Apr 16 16:59:00.205129 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.204813 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" event={"ID":"88f2cb88-fd7c-4867-8d75-328d9a4a5b92","Type":"ContainerDied","Data":"6d740be3aa12cbef0682deaf5b7c0abf2e1ef0b8b6179478e6dbc8a5d3b324e9"} Apr 16 16:59:00.205129 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.204851 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv" event={"ID":"88f2cb88-fd7c-4867-8d75-328d9a4a5b92","Type":"ContainerDied","Data":"cbce7e0c6302c3c019e3549925bff8fd67d581b90f94b23b2b5a2ea472eb479e"} Apr 16 16:59:00.205129 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.204869 2576 scope.go:117] "RemoveContainer" containerID="6d740be3aa12cbef0682deaf5b7c0abf2e1ef0b8b6179478e6dbc8a5d3b324e9" Apr 16 16:59:00.205470 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.205445 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" podUID="c63aba9a-0bec-4778-8115-a8cb69433080" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 16:59:00.213331 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.213311 2576 scope.go:117] "RemoveContainer" containerID="60a04ebabab81d22f8bda7905fffdc8bfb8b340f7fdf004816ffcd2793f4a680" Apr 16 16:59:00.219812 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.219796 2576 scope.go:117] "RemoveContainer" containerID="6d740be3aa12cbef0682deaf5b7c0abf2e1ef0b8b6179478e6dbc8a5d3b324e9" Apr 16 16:59:00.220061 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:59:00.220034 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6d740be3aa12cbef0682deaf5b7c0abf2e1ef0b8b6179478e6dbc8a5d3b324e9\": container with ID starting with 6d740be3aa12cbef0682deaf5b7c0abf2e1ef0b8b6179478e6dbc8a5d3b324e9 not found: ID does not exist" containerID="6d740be3aa12cbef0682deaf5b7c0abf2e1ef0b8b6179478e6dbc8a5d3b324e9" Apr 16 16:59:00.220113 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.220064 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d740be3aa12cbef0682deaf5b7c0abf2e1ef0b8b6179478e6dbc8a5d3b324e9"} err="failed to get container status \"6d740be3aa12cbef0682deaf5b7c0abf2e1ef0b8b6179478e6dbc8a5d3b324e9\": rpc error: code = NotFound desc = could not find container \"6d740be3aa12cbef0682deaf5b7c0abf2e1ef0b8b6179478e6dbc8a5d3b324e9\": container with ID starting with 6d740be3aa12cbef0682deaf5b7c0abf2e1ef0b8b6179478e6dbc8a5d3b324e9 not found: ID does not exist" Apr 16 16:59:00.220113 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.220080 2576 scope.go:117] "RemoveContainer" containerID="60a04ebabab81d22f8bda7905fffdc8bfb8b340f7fdf004816ffcd2793f4a680" Apr 16 16:59:00.220308 ip-10-0-137-150 kubenswrapper[2576]: E0416 16:59:00.220293 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60a04ebabab81d22f8bda7905fffdc8bfb8b340f7fdf004816ffcd2793f4a680\": container with ID starting with 60a04ebabab81d22f8bda7905fffdc8bfb8b340f7fdf004816ffcd2793f4a680 not found: ID does not exist" containerID="60a04ebabab81d22f8bda7905fffdc8bfb8b340f7fdf004816ffcd2793f4a680" Apr 16 16:59:00.220355 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.220311 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60a04ebabab81d22f8bda7905fffdc8bfb8b340f7fdf004816ffcd2793f4a680"} err="failed to get container status \"60a04ebabab81d22f8bda7905fffdc8bfb8b340f7fdf004816ffcd2793f4a680\": rpc error: code = NotFound desc = could not find container 
\"60a04ebabab81d22f8bda7905fffdc8bfb8b340f7fdf004816ffcd2793f4a680\": container with ID starting with 60a04ebabab81d22f8bda7905fffdc8bfb8b340f7fdf004816ffcd2793f4a680 not found: ID does not exist" Apr 16 16:59:00.224622 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.224599 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv"] Apr 16 16:59:00.226833 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:00.226808 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-675d9b5ff-5b9rv"] Apr 16 16:59:01.278361 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:01.278331 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" path="/var/lib/kubelet/pods/88f2cb88-fd7c-4867-8d75-328d9a4a5b92/volumes" Apr 16 16:59:10.205411 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:10.205369 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" podUID="c63aba9a-0bec-4778-8115-a8cb69433080" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 16:59:20.206063 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:20.206023 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" podUID="c63aba9a-0bec-4778-8115-a8cb69433080" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 16:59:30.205802 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:30.205759 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" podUID="c63aba9a-0bec-4778-8115-a8cb69433080" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 
16:59:40.205709 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:40.205661 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" podUID="c63aba9a-0bec-4778-8115-a8cb69433080" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 16:59:50.206266 ip-10-0-137-150 kubenswrapper[2576]: I0416 16:59:50.206167 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" podUID="c63aba9a-0bec-4778-8115-a8cb69433080" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 17:00:00.206175 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:00.206134 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" podUID="c63aba9a-0bec-4778-8115-a8cb69433080" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 17:00:10.206410 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:10.206377 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" Apr 16 17:00:16.711864 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:16.711825 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8"] Apr 16 17:00:16.712430 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:16.712087 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" podUID="c63aba9a-0bec-4778-8115-a8cb69433080" containerName="kserve-container" containerID="cri-o://183a702d2e94c228fadcd541f916c626703ca729378680396046986b4c936cf0" gracePeriod=30 Apr 16 17:00:16.812179 ip-10-0-137-150 
kubenswrapper[2576]: I0416 17:00:16.812149 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp"] Apr 16 17:00:16.812504 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:16.812491 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" containerName="storage-initializer" Apr 16 17:00:16.812553 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:16.812507 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" containerName="storage-initializer" Apr 16 17:00:16.812553 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:16.812516 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" containerName="kserve-container" Apr 16 17:00:16.812553 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:16.812521 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" containerName="kserve-container" Apr 16 17:00:16.812656 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:16.812581 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="88f2cb88-fd7c-4867-8d75-328d9a4a5b92" containerName="kserve-container" Apr 16 17:00:16.815593 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:16.815571 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" Apr 16 17:00:16.817802 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:16.817778 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 17:00:16.825415 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:16.825391 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp"] Apr 16 17:00:16.958077 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:16.958043 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3e4be2fd-154e-41ec-8f64-68c18bdde206-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp\" (UID: \"3e4be2fd-154e-41ec-8f64-68c18bdde206\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" Apr 16 17:00:16.958257 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:16.958088 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e4be2fd-154e-41ec-8f64-68c18bdde206-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp\" (UID: \"3e4be2fd-154e-41ec-8f64-68c18bdde206\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" Apr 16 17:00:17.058766 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:17.058685 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3e4be2fd-154e-41ec-8f64-68c18bdde206-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp\" (UID: \"3e4be2fd-154e-41ec-8f64-68c18bdde206\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" Apr 16 17:00:17.058766 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:17.058731 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e4be2fd-154e-41ec-8f64-68c18bdde206-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp\" (UID: \"3e4be2fd-154e-41ec-8f64-68c18bdde206\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" Apr 16 17:00:17.059096 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:17.059081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e4be2fd-154e-41ec-8f64-68c18bdde206-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp\" (UID: \"3e4be2fd-154e-41ec-8f64-68c18bdde206\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" Apr 16 17:00:17.059307 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:17.059288 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3e4be2fd-154e-41ec-8f64-68c18bdde206-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp\" (UID: \"3e4be2fd-154e-41ec-8f64-68c18bdde206\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" Apr 16 17:00:17.126261 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:17.126237 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" Apr 16 17:00:17.273904 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:17.273879 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp"] Apr 16 17:00:17.276882 ip-10-0-137-150 kubenswrapper[2576]: W0416 17:00:17.276846 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e4be2fd_154e_41ec_8f64_68c18bdde206.slice/crio-c28a717b03a12ef0eebb81da0fdbfb50965e56aed116153934a39afb0804dc05 WatchSource:0}: Error finding container c28a717b03a12ef0eebb81da0fdbfb50965e56aed116153934a39afb0804dc05: Status 404 returned error can't find the container with id c28a717b03a12ef0eebb81da0fdbfb50965e56aed116153934a39afb0804dc05 Apr 16 17:00:17.278828 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:17.278805 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:00:17.442863 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:17.442832 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" event={"ID":"3e4be2fd-154e-41ec-8f64-68c18bdde206","Type":"ContainerStarted","Data":"a20f52095b61e7931f143522859ac5daa743175f252cb6f438116addc82ddd3a"} Apr 16 17:00:17.443031 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:17.442872 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" event={"ID":"3e4be2fd-154e-41ec-8f64-68c18bdde206","Type":"ContainerStarted","Data":"c28a717b03a12ef0eebb81da0fdbfb50965e56aed116153934a39afb0804dc05"} Apr 16 17:00:18.447558 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:18.447526 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerID="a20f52095b61e7931f143522859ac5daa743175f252cb6f438116addc82ddd3a" exitCode=0 Apr 16 17:00:18.447978 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:18.447594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" event={"ID":"3e4be2fd-154e-41ec-8f64-68c18bdde206","Type":"ContainerDied","Data":"a20f52095b61e7931f143522859ac5daa743175f252cb6f438116addc82ddd3a"} Apr 16 17:00:19.451962 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:19.451712 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" event={"ID":"3e4be2fd-154e-41ec-8f64-68c18bdde206","Type":"ContainerStarted","Data":"5084070b1237459684024b178f2f6f25b8c42e243e9ae2e2133762961d852a5a"} Apr 16 17:00:19.454474 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:19.454443 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" Apr 16 17:00:19.457061 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:19.457026 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 17:00:19.470440 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:19.470398 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" podStartSLOduration=3.470386912 podStartE2EDuration="3.470386912s" podCreationTimestamp="2026-04-16 17:00:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-16 17:00:19.468926531 +0000 UTC m=+3442.823916672" watchObservedRunningTime="2026-04-16 17:00:19.470386912 +0000 UTC m=+3442.825377040" Apr 16 17:00:20.205704 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:20.205660 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" podUID="c63aba9a-0bec-4778-8115-a8cb69433080" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.69:8080: connect: connection refused" Apr 16 17:00:20.455953 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:20.455855 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 17:00:20.956125 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:20.956101 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" Apr 16 17:00:21.088994 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.088912 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c63aba9a-0bec-4778-8115-a8cb69433080-kserve-provision-location\") pod \"c63aba9a-0bec-4778-8115-a8cb69433080\" (UID: \"c63aba9a-0bec-4778-8115-a8cb69433080\") " Apr 16 17:00:21.089244 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.089198 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c63aba9a-0bec-4778-8115-a8cb69433080-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c63aba9a-0bec-4778-8115-a8cb69433080" (UID: "c63aba9a-0bec-4778-8115-a8cb69433080"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:00:21.189583 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.189551 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c63aba9a-0bec-4778-8115-a8cb69433080-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 17:00:21.459347 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.459315 2576 generic.go:358] "Generic (PLEG): container finished" podID="c63aba9a-0bec-4778-8115-a8cb69433080" containerID="183a702d2e94c228fadcd541f916c626703ca729378680396046986b4c936cf0" exitCode=0 Apr 16 17:00:21.459767 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.459378 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" Apr 16 17:00:21.459767 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.459403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" event={"ID":"c63aba9a-0bec-4778-8115-a8cb69433080","Type":"ContainerDied","Data":"183a702d2e94c228fadcd541f916c626703ca729378680396046986b4c936cf0"} Apr 16 17:00:21.459767 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.459439 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8" event={"ID":"c63aba9a-0bec-4778-8115-a8cb69433080","Type":"ContainerDied","Data":"b52c6657f045483e0ce436a8ded8fadddd3340f6b4ff0fcdfa30baaa01f7684d"} Apr 16 17:00:21.459767 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.459456 2576 scope.go:117] "RemoveContainer" containerID="183a702d2e94c228fadcd541f916c626703ca729378680396046986b4c936cf0" Apr 16 17:00:21.460066 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.460036 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 17:00:21.467353 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.467334 2576 scope.go:117] "RemoveContainer" containerID="95cb2f0bfc853ca1eaad10b5f3708c0f4358d27572b7fd42f0312afb1063d979" Apr 16 17:00:21.473873 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.473857 2576 scope.go:117] "RemoveContainer" containerID="183a702d2e94c228fadcd541f916c626703ca729378680396046986b4c936cf0" Apr 16 17:00:21.474078 ip-10-0-137-150 kubenswrapper[2576]: E0416 17:00:21.474060 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"183a702d2e94c228fadcd541f916c626703ca729378680396046986b4c936cf0\": container with ID starting with 183a702d2e94c228fadcd541f916c626703ca729378680396046986b4c936cf0 not found: ID does not exist" containerID="183a702d2e94c228fadcd541f916c626703ca729378680396046986b4c936cf0" Apr 16 17:00:21.474128 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.474087 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"183a702d2e94c228fadcd541f916c626703ca729378680396046986b4c936cf0"} err="failed to get container status \"183a702d2e94c228fadcd541f916c626703ca729378680396046986b4c936cf0\": rpc error: code = NotFound desc = could not find container \"183a702d2e94c228fadcd541f916c626703ca729378680396046986b4c936cf0\": container with ID starting with 183a702d2e94c228fadcd541f916c626703ca729378680396046986b4c936cf0 not found: ID does not exist" Apr 16 17:00:21.474128 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.474103 2576 scope.go:117] "RemoveContainer" containerID="95cb2f0bfc853ca1eaad10b5f3708c0f4358d27572b7fd42f0312afb1063d979" Apr 16 17:00:21.474347 ip-10-0-137-150 kubenswrapper[2576]: 
E0416 17:00:21.474334 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95cb2f0bfc853ca1eaad10b5f3708c0f4358d27572b7fd42f0312afb1063d979\": container with ID starting with 95cb2f0bfc853ca1eaad10b5f3708c0f4358d27572b7fd42f0312afb1063d979 not found: ID does not exist" containerID="95cb2f0bfc853ca1eaad10b5f3708c0f4358d27572b7fd42f0312afb1063d979" Apr 16 17:00:21.474393 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.474352 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cb2f0bfc853ca1eaad10b5f3708c0f4358d27572b7fd42f0312afb1063d979"} err="failed to get container status \"95cb2f0bfc853ca1eaad10b5f3708c0f4358d27572b7fd42f0312afb1063d979\": rpc error: code = NotFound desc = could not find container \"95cb2f0bfc853ca1eaad10b5f3708c0f4358d27572b7fd42f0312afb1063d979\": container with ID starting with 95cb2f0bfc853ca1eaad10b5f3708c0f4358d27572b7fd42f0312afb1063d979 not found: ID does not exist" Apr 16 17:00:21.481456 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.481436 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8"] Apr 16 17:00:21.484232 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:21.484197 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-57cc89c4d5-f8wg8"] Apr 16 17:00:23.278041 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:23.277981 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c63aba9a-0bec-4778-8115-a8cb69433080" path="/var/lib/kubelet/pods/c63aba9a-0bec-4778-8115-a8cb69433080/volumes" Apr 16 17:00:31.461110 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:31.461067 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 17:00:41.460009 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:41.459967 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 17:00:51.460231 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:00:51.460177 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 17:01:01.460265 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:01.460196 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 17:01:11.461018 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:11.460978 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 17:01:21.460594 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:21.460506 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 17:01:29.278830 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:29.278803 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" Apr 16 17:01:36.857323 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:36.857283 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp"] Apr 16 17:01:36.857743 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:36.857559 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerName="kserve-container" containerID="cri-o://5084070b1237459684024b178f2f6f25b8c42e243e9ae2e2133762961d852a5a" gracePeriod=30 Apr 16 17:01:37.954047 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:37.954006 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26"] Apr 16 17:01:37.954528 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:37.954459 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c63aba9a-0bec-4778-8115-a8cb69433080" containerName="storage-initializer" Apr 16 17:01:37.954528 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:37.954478 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c63aba9a-0bec-4778-8115-a8cb69433080" containerName="storage-initializer" Apr 16 17:01:37.954528 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:37.954509 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c63aba9a-0bec-4778-8115-a8cb69433080" containerName="kserve-container" Apr 16 17:01:37.954528 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:37.954517 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c63aba9a-0bec-4778-8115-a8cb69433080" containerName="kserve-container" Apr 16 17:01:37.954747 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:37.954599 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c63aba9a-0bec-4778-8115-a8cb69433080" containerName="kserve-container" Apr 16 17:01:37.957601 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:37.957580 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26" Apr 16 17:01:37.966336 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:37.966314 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26"] Apr 16 17:01:38.091330 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:38.091284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bea2097c-6223-4b62-bb9b-00a7c35c2446-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26\" (UID: \"bea2097c-6223-4b62-bb9b-00a7c35c2446\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26" Apr 16 17:01:38.192178 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:38.192133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bea2097c-6223-4b62-bb9b-00a7c35c2446-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26\" (UID: \"bea2097c-6223-4b62-bb9b-00a7c35c2446\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26" Apr 16 17:01:38.192551 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:38.192533 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/bea2097c-6223-4b62-bb9b-00a7c35c2446-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26\" (UID: \"bea2097c-6223-4b62-bb9b-00a7c35c2446\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26" Apr 16 17:01:38.269399 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:38.269308 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26" Apr 16 17:01:38.425685 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:38.425644 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26"] Apr 16 17:01:38.429686 ip-10-0-137-150 kubenswrapper[2576]: W0416 17:01:38.429658 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbea2097c_6223_4b62_bb9b_00a7c35c2446.slice/crio-ba7f73455e0068d3d9aea556b859ec39d277a803a9215da6b23a8fbdc27cde07 WatchSource:0}: Error finding container ba7f73455e0068d3d9aea556b859ec39d277a803a9215da6b23a8fbdc27cde07: Status 404 returned error can't find the container with id ba7f73455e0068d3d9aea556b859ec39d277a803a9215da6b23a8fbdc27cde07 Apr 16 17:01:38.703122 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:38.703088 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26" event={"ID":"bea2097c-6223-4b62-bb9b-00a7c35c2446","Type":"ContainerStarted","Data":"e9de11d902d81d83ebcce79a8121fa5f12730f6b25df578a5e95069af8870a9e"} Apr 16 17:01:38.703122 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:38.703124 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26" 
event={"ID":"bea2097c-6223-4b62-bb9b-00a7c35c2446","Type":"ContainerStarted","Data":"ba7f73455e0068d3d9aea556b859ec39d277a803a9215da6b23a8fbdc27cde07"} Apr 16 17:01:39.275060 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:39.275012 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused" Apr 16 17:01:41.391342 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.391318 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" Apr 16 17:01:41.519121 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.519026 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e4be2fd-154e-41ec-8f64-68c18bdde206-kserve-provision-location\") pod \"3e4be2fd-154e-41ec-8f64-68c18bdde206\" (UID: \"3e4be2fd-154e-41ec-8f64-68c18bdde206\") " Apr 16 17:01:41.519312 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.519132 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3e4be2fd-154e-41ec-8f64-68c18bdde206-cabundle-cert\") pod \"3e4be2fd-154e-41ec-8f64-68c18bdde206\" (UID: \"3e4be2fd-154e-41ec-8f64-68c18bdde206\") " Apr 16 17:01:41.519460 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.519427 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e4be2fd-154e-41ec-8f64-68c18bdde206-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3e4be2fd-154e-41ec-8f64-68c18bdde206" (UID: "3e4be2fd-154e-41ec-8f64-68c18bdde206"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:01:41.519563 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.519512 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e4be2fd-154e-41ec-8f64-68c18bdde206-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "3e4be2fd-154e-41ec-8f64-68c18bdde206" (UID: "3e4be2fd-154e-41ec-8f64-68c18bdde206"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:01:41.620133 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.620093 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3e4be2fd-154e-41ec-8f64-68c18bdde206-cabundle-cert\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 17:01:41.620133 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.620126 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e4be2fd-154e-41ec-8f64-68c18bdde206-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 17:01:41.714923 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.714887 2576 generic.go:358] "Generic (PLEG): container finished" podID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerID="5084070b1237459684024b178f2f6f25b8c42e243e9ae2e2133762961d852a5a" exitCode=0 Apr 16 17:01:41.715094 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.714973 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" Apr 16 17:01:41.715094 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.714978 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" event={"ID":"3e4be2fd-154e-41ec-8f64-68c18bdde206","Type":"ContainerDied","Data":"5084070b1237459684024b178f2f6f25b8c42e243e9ae2e2133762961d852a5a"} Apr 16 17:01:41.715094 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.715019 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp" event={"ID":"3e4be2fd-154e-41ec-8f64-68c18bdde206","Type":"ContainerDied","Data":"c28a717b03a12ef0eebb81da0fdbfb50965e56aed116153934a39afb0804dc05"} Apr 16 17:01:41.715094 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.715035 2576 scope.go:117] "RemoveContainer" containerID="5084070b1237459684024b178f2f6f25b8c42e243e9ae2e2133762961d852a5a" Apr 16 17:01:41.728127 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.728106 2576 scope.go:117] "RemoveContainer" containerID="a20f52095b61e7931f143522859ac5daa743175f252cb6f438116addc82ddd3a" Apr 16 17:01:41.735234 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.735205 2576 scope.go:117] "RemoveContainer" containerID="5084070b1237459684024b178f2f6f25b8c42e243e9ae2e2133762961d852a5a" Apr 16 17:01:41.735509 ip-10-0-137-150 kubenswrapper[2576]: E0416 17:01:41.735487 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5084070b1237459684024b178f2f6f25b8c42e243e9ae2e2133762961d852a5a\": container with ID starting with 5084070b1237459684024b178f2f6f25b8c42e243e9ae2e2133762961d852a5a not found: ID does not exist" containerID="5084070b1237459684024b178f2f6f25b8c42e243e9ae2e2133762961d852a5a" Apr 16 17:01:41.735599 ip-10-0-137-150 kubenswrapper[2576]: I0416 
17:01:41.735520 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5084070b1237459684024b178f2f6f25b8c42e243e9ae2e2133762961d852a5a"} err="failed to get container status \"5084070b1237459684024b178f2f6f25b8c42e243e9ae2e2133762961d852a5a\": rpc error: code = NotFound desc = could not find container \"5084070b1237459684024b178f2f6f25b8c42e243e9ae2e2133762961d852a5a\": container with ID starting with 5084070b1237459684024b178f2f6f25b8c42e243e9ae2e2133762961d852a5a not found: ID does not exist" Apr 16 17:01:41.735599 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.735543 2576 scope.go:117] "RemoveContainer" containerID="a20f52095b61e7931f143522859ac5daa743175f252cb6f438116addc82ddd3a" Apr 16 17:01:41.735766 ip-10-0-137-150 kubenswrapper[2576]: E0416 17:01:41.735747 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a20f52095b61e7931f143522859ac5daa743175f252cb6f438116addc82ddd3a\": container with ID starting with a20f52095b61e7931f143522859ac5daa743175f252cb6f438116addc82ddd3a not found: ID does not exist" containerID="a20f52095b61e7931f143522859ac5daa743175f252cb6f438116addc82ddd3a" Apr 16 17:01:41.735807 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.735773 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a20f52095b61e7931f143522859ac5daa743175f252cb6f438116addc82ddd3a"} err="failed to get container status \"a20f52095b61e7931f143522859ac5daa743175f252cb6f438116addc82ddd3a\": rpc error: code = NotFound desc = could not find container \"a20f52095b61e7931f143522859ac5daa743175f252cb6f438116addc82ddd3a\": container with ID starting with a20f52095b61e7931f143522859ac5daa743175f252cb6f438116addc82ddd3a not found: ID does not exist" Apr 16 17:01:41.760138 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.760109 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp"] Apr 16 17:01:41.768727 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:41.768704 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-77779fc9bc-jfkwp"] Apr 16 17:01:43.279391 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:43.279358 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" path="/var/lib/kubelet/pods/3e4be2fd-154e-41ec-8f64-68c18bdde206/volumes" Apr 16 17:01:45.732638 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:45.732610 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26_bea2097c-6223-4b62-bb9b-00a7c35c2446/storage-initializer/0.log" Apr 16 17:01:45.733029 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:45.732658 2576 generic.go:358] "Generic (PLEG): container finished" podID="bea2097c-6223-4b62-bb9b-00a7c35c2446" containerID="e9de11d902d81d83ebcce79a8121fa5f12730f6b25df578a5e95069af8870a9e" exitCode=1 Apr 16 17:01:45.733029 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:45.732718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26" event={"ID":"bea2097c-6223-4b62-bb9b-00a7c35c2446","Type":"ContainerDied","Data":"e9de11d902d81d83ebcce79a8121fa5f12730f6b25df578a5e95069af8870a9e"} Apr 16 17:01:46.737130 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:46.737101 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26_bea2097c-6223-4b62-bb9b-00a7c35c2446/storage-initializer/0.log" Apr 16 17:01:46.737526 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:46.737159 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26" event={"ID":"bea2097c-6223-4b62-bb9b-00a7c35c2446","Type":"ContainerStarted","Data":"c19aca705d136077a3a16316045d1bedd8995d923954c3c9a343ed303d856e7a"} Apr 16 17:01:47.947982 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:47.947941 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26"] Apr 16 17:01:47.948433 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:47.948192 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26" podUID="bea2097c-6223-4b62-bb9b-00a7c35c2446" containerName="storage-initializer" containerID="cri-o://c19aca705d136077a3a16316045d1bedd8995d923954c3c9a343ed303d856e7a" gracePeriod=30 Apr 16 17:01:49.017982 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.017952 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl"] Apr 16 17:01:49.018386 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.018264 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerName="kserve-container" Apr 16 17:01:49.018386 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.018275 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerName="kserve-container" Apr 16 17:01:49.018386 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.018285 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerName="storage-initializer" Apr 16 17:01:49.018386 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.018292 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" 
containerName="storage-initializer" Apr 16 17:01:49.018386 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.018352 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e4be2fd-154e-41ec-8f64-68c18bdde206" containerName="kserve-container" Apr 16 17:01:49.021254 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.021237 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" Apr 16 17:01:49.023552 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.023530 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 17:01:49.029290 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.029267 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl"] Apr 16 17:01:49.183653 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.183616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl\" (UID: \"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" Apr 16 17:01:49.183653 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.183658 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl\" (UID: \"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" Apr 16 17:01:49.284937 ip-10-0-137-150 
kubenswrapper[2576]: I0416 17:01:49.284853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl\" (UID: \"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" Apr 16 17:01:49.284937 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.284890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl\" (UID: \"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" Apr 16 17:01:49.285242 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.285199 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl\" (UID: \"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" Apr 16 17:01:49.285543 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.285520 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl\" (UID: \"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" Apr 16 17:01:49.332026 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.331990 2576 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" Apr 16 17:01:49.452702 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.452653 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl"] Apr 16 17:01:49.455820 ip-10-0-137-150 kubenswrapper[2576]: W0416 17:01:49.455785 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9fb0344_9c97_4c17_8bcd_fcfa4c73cf21.slice/crio-1a0be43c2d510c360e3a9fb29bf76fc88fd76185e51083c6ca305eef1b15f715 WatchSource:0}: Error finding container 1a0be43c2d510c360e3a9fb29bf76fc88fd76185e51083c6ca305eef1b15f715: Status 404 returned error can't find the container with id 1a0be43c2d510c360e3a9fb29bf76fc88fd76185e51083c6ca305eef1b15f715 Apr 16 17:01:49.749228 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.749185 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" event={"ID":"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21","Type":"ContainerStarted","Data":"7c44b599f9ee05cb7d0fab5518e14334bf72f08e3220e7866386da492a7da052"} Apr 16 17:01:49.749437 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:49.749238 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" event={"ID":"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21","Type":"ContainerStarted","Data":"1a0be43c2d510c360e3a9fb29bf76fc88fd76185e51083c6ca305eef1b15f715"} Apr 16 17:01:50.177399 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.177372 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26_bea2097c-6223-4b62-bb9b-00a7c35c2446/storage-initializer/1.log" Apr 16 17:01:50.177768 ip-10-0-137-150 kubenswrapper[2576]: 
I0416 17:01:50.177713 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26_bea2097c-6223-4b62-bb9b-00a7c35c2446/storage-initializer/0.log" Apr 16 17:01:50.177809 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.177771 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26" Apr 16 17:01:50.294274 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.294182 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bea2097c-6223-4b62-bb9b-00a7c35c2446-kserve-provision-location\") pod \"bea2097c-6223-4b62-bb9b-00a7c35c2446\" (UID: \"bea2097c-6223-4b62-bb9b-00a7c35c2446\") " Apr 16 17:01:50.294440 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.294418 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bea2097c-6223-4b62-bb9b-00a7c35c2446-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bea2097c-6223-4b62-bb9b-00a7c35c2446" (UID: "bea2097c-6223-4b62-bb9b-00a7c35c2446"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:01:50.395725 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.395685 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bea2097c-6223-4b62-bb9b-00a7c35c2446-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 17:01:50.753489 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.753462 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26_bea2097c-6223-4b62-bb9b-00a7c35c2446/storage-initializer/1.log" Apr 16 17:01:50.753823 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.753806 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26_bea2097c-6223-4b62-bb9b-00a7c35c2446/storage-initializer/0.log" Apr 16 17:01:50.753871 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.753847 2576 generic.go:358] "Generic (PLEG): container finished" podID="bea2097c-6223-4b62-bb9b-00a7c35c2446" containerID="c19aca705d136077a3a16316045d1bedd8995d923954c3c9a343ed303d856e7a" exitCode=1 Apr 16 17:01:50.753927 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.753911 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26" event={"ID":"bea2097c-6223-4b62-bb9b-00a7c35c2446","Type":"ContainerDied","Data":"c19aca705d136077a3a16316045d1bedd8995d923954c3c9a343ed303d856e7a"} Apr 16 17:01:50.753975 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.753935 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26" event={"ID":"bea2097c-6223-4b62-bb9b-00a7c35c2446","Type":"ContainerDied","Data":"ba7f73455e0068d3d9aea556b859ec39d277a803a9215da6b23a8fbdc27cde07"} Apr 16 
17:01:50.753975 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.753937 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26" Apr 16 17:01:50.753975 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.753955 2576 scope.go:117] "RemoveContainer" containerID="c19aca705d136077a3a16316045d1bedd8995d923954c3c9a343ed303d856e7a" Apr 16 17:01:50.755436 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.755412 2576 generic.go:358] "Generic (PLEG): container finished" podID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerID="7c44b599f9ee05cb7d0fab5518e14334bf72f08e3220e7866386da492a7da052" exitCode=0 Apr 16 17:01:50.755566 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.755434 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" event={"ID":"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21","Type":"ContainerDied","Data":"7c44b599f9ee05cb7d0fab5518e14334bf72f08e3220e7866386da492a7da052"} Apr 16 17:01:50.763149 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.763006 2576 scope.go:117] "RemoveContainer" containerID="e9de11d902d81d83ebcce79a8121fa5f12730f6b25df578a5e95069af8870a9e" Apr 16 17:01:50.771091 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.771067 2576 scope.go:117] "RemoveContainer" containerID="c19aca705d136077a3a16316045d1bedd8995d923954c3c9a343ed303d856e7a" Apr 16 17:01:50.771380 ip-10-0-137-150 kubenswrapper[2576]: E0416 17:01:50.771363 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c19aca705d136077a3a16316045d1bedd8995d923954c3c9a343ed303d856e7a\": container with ID starting with c19aca705d136077a3a16316045d1bedd8995d923954c3c9a343ed303d856e7a not found: ID does not exist" containerID="c19aca705d136077a3a16316045d1bedd8995d923954c3c9a343ed303d856e7a" Apr 16 17:01:50.771435 
ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.771389 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c19aca705d136077a3a16316045d1bedd8995d923954c3c9a343ed303d856e7a"} err="failed to get container status \"c19aca705d136077a3a16316045d1bedd8995d923954c3c9a343ed303d856e7a\": rpc error: code = NotFound desc = could not find container \"c19aca705d136077a3a16316045d1bedd8995d923954c3c9a343ed303d856e7a\": container with ID starting with c19aca705d136077a3a16316045d1bedd8995d923954c3c9a343ed303d856e7a not found: ID does not exist" Apr 16 17:01:50.771435 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.771408 2576 scope.go:117] "RemoveContainer" containerID="e9de11d902d81d83ebcce79a8121fa5f12730f6b25df578a5e95069af8870a9e" Apr 16 17:01:50.771653 ip-10-0-137-150 kubenswrapper[2576]: E0416 17:01:50.771636 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9de11d902d81d83ebcce79a8121fa5f12730f6b25df578a5e95069af8870a9e\": container with ID starting with e9de11d902d81d83ebcce79a8121fa5f12730f6b25df578a5e95069af8870a9e not found: ID does not exist" containerID="e9de11d902d81d83ebcce79a8121fa5f12730f6b25df578a5e95069af8870a9e" Apr 16 17:01:50.771698 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.771657 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9de11d902d81d83ebcce79a8121fa5f12730f6b25df578a5e95069af8870a9e"} err="failed to get container status \"e9de11d902d81d83ebcce79a8121fa5f12730f6b25df578a5e95069af8870a9e\": rpc error: code = NotFound desc = could not find container \"e9de11d902d81d83ebcce79a8121fa5f12730f6b25df578a5e95069af8870a9e\": container with ID starting with e9de11d902d81d83ebcce79a8121fa5f12730f6b25df578a5e95069af8870a9e not found: ID does not exist" Apr 16 17:01:50.801319 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.801281 2576 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26"] Apr 16 17:01:50.805387 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:50.805359 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-67954558cd-87z26"] Apr 16 17:01:51.278190 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:51.278148 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bea2097c-6223-4b62-bb9b-00a7c35c2446" path="/var/lib/kubelet/pods/bea2097c-6223-4b62-bb9b-00a7c35c2446/volumes" Apr 16 17:01:51.760556 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:51.760519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" event={"ID":"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21","Type":"ContainerStarted","Data":"3178cadacd12b1dee8b6f0b45ff5f17f122b0ce310f4eec2f8948f52d9ca9d69"} Apr 16 17:01:51.760765 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:51.760733 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" Apr 16 17:01:51.762173 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:51.762130 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.72:8080: connect: connection refused" Apr 16 17:01:51.776626 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:51.776575 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" podStartSLOduration=2.7765609209999997 podStartE2EDuration="2.776560921s" podCreationTimestamp="2026-04-16 17:01:49 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:01:51.775677165 +0000 UTC m=+3535.130667296" watchObservedRunningTime="2026-04-16 17:01:51.776560921 +0000 UTC m=+3535.131551049" Apr 16 17:01:52.764166 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:01:52.764127 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.72:8080: connect: connection refused" Apr 16 17:02:02.765122 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:02:02.765077 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.72:8080: connect: connection refused" Apr 16 17:02:05.605840 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:02:05.605812 2576 scope.go:117] "RemoveContainer" containerID="cdb1c5131e897b79a02c55cd4ed80bd9790acfb4ba7481a4ac12aeeccc209964" Apr 16 17:02:05.613722 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:02:05.613705 2576 scope.go:117] "RemoveContainer" containerID="e207a33a90fe479b8476c02f38285d4ed46fc9fffa89444bc2cc3be34c9a0af3" Apr 16 17:02:12.764969 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:02:12.764927 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.72:8080: connect: connection refused" Apr 16 17:02:22.764142 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:02:22.764094 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.72:8080: connect: connection refused" Apr 16 17:02:32.765082 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:02:32.765039 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.72:8080: connect: connection refused" Apr 16 17:02:42.764677 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:02:42.764636 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.72:8080: connect: connection refused" Apr 16 17:02:52.764509 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:02:52.764470 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.72:8080: connect: connection refused" Apr 16 17:03:02.765995 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:02.765964 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" Apr 16 17:03:09.067637 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:09.067594 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl"] Apr 16 17:03:09.068018 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:09.067855 2576 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerName="kserve-container" containerID="cri-o://3178cadacd12b1dee8b6f0b45ff5f17f122b0ce310f4eec2f8948f52d9ca9d69" gracePeriod=30 Apr 16 17:03:10.192320 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:10.192285 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt"] Apr 16 17:03:10.192693 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:10.192609 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bea2097c-6223-4b62-bb9b-00a7c35c2446" containerName="storage-initializer" Apr 16 17:03:10.192693 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:10.192620 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea2097c-6223-4b62-bb9b-00a7c35c2446" containerName="storage-initializer" Apr 16 17:03:10.192693 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:10.192629 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bea2097c-6223-4b62-bb9b-00a7c35c2446" containerName="storage-initializer" Apr 16 17:03:10.192693 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:10.192634 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bea2097c-6223-4b62-bb9b-00a7c35c2446" containerName="storage-initializer" Apr 16 17:03:10.192693 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:10.192693 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bea2097c-6223-4b62-bb9b-00a7c35c2446" containerName="storage-initializer" Apr 16 17:03:10.192864 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:10.192703 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bea2097c-6223-4b62-bb9b-00a7c35c2446" containerName="storage-initializer" Apr 16 17:03:10.195510 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:10.195491 2576 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt" Apr 16 17:03:10.205336 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:10.205314 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt"] Apr 16 17:03:10.298334 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:10.298293 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9c8c783-6730-47e2-aaae-bd7548ec194c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt\" (UID: \"f9c8c783-6730-47e2-aaae-bd7548ec194c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt" Apr 16 17:03:10.399600 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:10.399566 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9c8c783-6730-47e2-aaae-bd7548ec194c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt\" (UID: \"f9c8c783-6730-47e2-aaae-bd7548ec194c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt" Apr 16 17:03:10.399992 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:10.399972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9c8c783-6730-47e2-aaae-bd7548ec194c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt\" (UID: \"f9c8c783-6730-47e2-aaae-bd7548ec194c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt" Apr 16 17:03:10.505876 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:10.505813 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt" Apr 16 17:03:10.624484 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:10.624450 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt"] Apr 16 17:03:10.642904 ip-10-0-137-150 kubenswrapper[2576]: W0416 17:03:10.642871 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9c8c783_6730_47e2_aaae_bd7548ec194c.slice/crio-a28fd0a5bc6a80ff792f533f9b2161bcd0f6a7c77b39ec6e1f7c119d5a79d2cc WatchSource:0}: Error finding container a28fd0a5bc6a80ff792f533f9b2161bcd0f6a7c77b39ec6e1f7c119d5a79d2cc: Status 404 returned error can't find the container with id a28fd0a5bc6a80ff792f533f9b2161bcd0f6a7c77b39ec6e1f7c119d5a79d2cc Apr 16 17:03:11.005421 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:11.005382 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt" event={"ID":"f9c8c783-6730-47e2-aaae-bd7548ec194c","Type":"ContainerStarted","Data":"fce8f60f466b02f1a507830b930c2d8d13b0d35b03c20a3acd68cab0246dff8f"} Apr 16 17:03:11.005421 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:11.005419 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt" event={"ID":"f9c8c783-6730-47e2-aaae-bd7548ec194c","Type":"ContainerStarted","Data":"a28fd0a5bc6a80ff792f533f9b2161bcd0f6a7c77b39ec6e1f7c119d5a79d2cc"} Apr 16 17:03:12.764727 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:12.764688 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.72:8080: 
connect: connection refused" Apr 16 17:03:13.298759 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:13.298738 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" Apr 16 17:03:13.423087 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:13.423055 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21-kserve-provision-location\") pod \"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21\" (UID: \"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21\") " Apr 16 17:03:13.423289 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:13.423141 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21-cabundle-cert\") pod \"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21\" (UID: \"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21\") " Apr 16 17:03:13.423460 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:13.423431 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" (UID: "a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:03:13.423535 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:13.423482 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" (UID: "a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:03:13.524378 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:13.524348 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 17:03:13.524378 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:13.524375 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21-cabundle-cert\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 17:03:14.015617 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:14.015581 2576 generic.go:358] "Generic (PLEG): container finished" podID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerID="3178cadacd12b1dee8b6f0b45ff5f17f122b0ce310f4eec2f8948f52d9ca9d69" exitCode=0 Apr 16 17:03:14.016105 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:14.015656 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" Apr 16 17:03:14.016105 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:14.015656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" event={"ID":"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21","Type":"ContainerDied","Data":"3178cadacd12b1dee8b6f0b45ff5f17f122b0ce310f4eec2f8948f52d9ca9d69"} Apr 16 17:03:14.016105 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:14.015695 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl" event={"ID":"a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21","Type":"ContainerDied","Data":"1a0be43c2d510c360e3a9fb29bf76fc88fd76185e51083c6ca305eef1b15f715"} Apr 16 17:03:14.016105 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:14.015710 2576 scope.go:117] "RemoveContainer" containerID="3178cadacd12b1dee8b6f0b45ff5f17f122b0ce310f4eec2f8948f52d9ca9d69" Apr 16 17:03:14.023989 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:14.023974 2576 scope.go:117] "RemoveContainer" containerID="7c44b599f9ee05cb7d0fab5518e14334bf72f08e3220e7866386da492a7da052" Apr 16 17:03:14.030905 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:14.030888 2576 scope.go:117] "RemoveContainer" containerID="3178cadacd12b1dee8b6f0b45ff5f17f122b0ce310f4eec2f8948f52d9ca9d69" Apr 16 17:03:14.031133 ip-10-0-137-150 kubenswrapper[2576]: E0416 17:03:14.031113 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3178cadacd12b1dee8b6f0b45ff5f17f122b0ce310f4eec2f8948f52d9ca9d69\": container with ID starting with 3178cadacd12b1dee8b6f0b45ff5f17f122b0ce310f4eec2f8948f52d9ca9d69 not found: ID does not exist" containerID="3178cadacd12b1dee8b6f0b45ff5f17f122b0ce310f4eec2f8948f52d9ca9d69" Apr 16 17:03:14.031227 ip-10-0-137-150 kubenswrapper[2576]: I0416 
17:03:14.031144 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3178cadacd12b1dee8b6f0b45ff5f17f122b0ce310f4eec2f8948f52d9ca9d69"} err="failed to get container status \"3178cadacd12b1dee8b6f0b45ff5f17f122b0ce310f4eec2f8948f52d9ca9d69\": rpc error: code = NotFound desc = could not find container \"3178cadacd12b1dee8b6f0b45ff5f17f122b0ce310f4eec2f8948f52d9ca9d69\": container with ID starting with 3178cadacd12b1dee8b6f0b45ff5f17f122b0ce310f4eec2f8948f52d9ca9d69 not found: ID does not exist" Apr 16 17:03:14.031227 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:14.031168 2576 scope.go:117] "RemoveContainer" containerID="7c44b599f9ee05cb7d0fab5518e14334bf72f08e3220e7866386da492a7da052" Apr 16 17:03:14.031421 ip-10-0-137-150 kubenswrapper[2576]: E0416 17:03:14.031404 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c44b599f9ee05cb7d0fab5518e14334bf72f08e3220e7866386da492a7da052\": container with ID starting with 7c44b599f9ee05cb7d0fab5518e14334bf72f08e3220e7866386da492a7da052 not found: ID does not exist" containerID="7c44b599f9ee05cb7d0fab5518e14334bf72f08e3220e7866386da492a7da052" Apr 16 17:03:14.031467 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:14.031428 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c44b599f9ee05cb7d0fab5518e14334bf72f08e3220e7866386da492a7da052"} err="failed to get container status \"7c44b599f9ee05cb7d0fab5518e14334bf72f08e3220e7866386da492a7da052\": rpc error: code = NotFound desc = could not find container \"7c44b599f9ee05cb7d0fab5518e14334bf72f08e3220e7866386da492a7da052\": container with ID starting with 7c44b599f9ee05cb7d0fab5518e14334bf72f08e3220e7866386da492a7da052 not found: ID does not exist" Apr 16 17:03:14.040193 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:14.040173 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl"] Apr 16 17:03:14.042492 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:14.042472 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-c65bd7c47-b84cl"] Apr 16 17:03:15.277911 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:15.277877 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" path="/var/lib/kubelet/pods/a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21/volumes" Apr 16 17:03:16.023851 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:16.023820 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt_f9c8c783-6730-47e2-aaae-bd7548ec194c/storage-initializer/0.log" Apr 16 17:03:16.024018 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:16.023858 2576 generic.go:358] "Generic (PLEG): container finished" podID="f9c8c783-6730-47e2-aaae-bd7548ec194c" containerID="fce8f60f466b02f1a507830b930c2d8d13b0d35b03c20a3acd68cab0246dff8f" exitCode=1 Apr 16 17:03:16.024018 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:16.023934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt" event={"ID":"f9c8c783-6730-47e2-aaae-bd7548ec194c","Type":"ContainerDied","Data":"fce8f60f466b02f1a507830b930c2d8d13b0d35b03c20a3acd68cab0246dff8f"} Apr 16 17:03:17.028581 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:17.028553 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt_f9c8c783-6730-47e2-aaae-bd7548ec194c/storage-initializer/0.log" Apr 16 17:03:17.028977 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:17.028671 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt" event={"ID":"f9c8c783-6730-47e2-aaae-bd7548ec194c","Type":"ContainerStarted","Data":"b8db9524f0367ca90e94f7aa05e6c57ee35adc1f811287ccf2b020fec6b2d617"} Apr 16 17:03:20.212159 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:20.212121 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt"] Apr 16 17:03:20.212644 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:20.212422 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt" podUID="f9c8c783-6730-47e2-aaae-bd7548ec194c" containerName="storage-initializer" containerID="cri-o://b8db9524f0367ca90e94f7aa05e6c57ee35adc1f811287ccf2b020fec6b2d617" gracePeriod=30 Apr 16 17:03:21.236054 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.236009 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd"] Apr 16 17:03:21.236468 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.236353 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerName="kserve-container" Apr 16 17:03:21.236468 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.236366 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerName="kserve-container" Apr 16 17:03:21.236468 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.236377 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerName="storage-initializer" Apr 16 17:03:21.236468 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.236383 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" 
containerName="storage-initializer" Apr 16 17:03:21.236468 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.236455 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9fb0344-9c97-4c17-8bcd-fcfa4c73cf21" containerName="kserve-container" Apr 16 17:03:21.239509 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.239491 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" Apr 16 17:03:21.245555 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.245526 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 17:03:21.249835 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.249808 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd"] Apr 16 17:03:21.385199 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.385160 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/043e5d5e-bf8f-4ce1-9c22-5530da62284c-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd\" (UID: \"043e5d5e-bf8f-4ce1-9c22-5530da62284c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" Apr 16 17:03:21.385429 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.385201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/043e5d5e-bf8f-4ce1-9c22-5530da62284c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd\" (UID: \"043e5d5e-bf8f-4ce1-9c22-5530da62284c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" Apr 16 17:03:21.486608 ip-10-0-137-150 
kubenswrapper[2576]: I0416 17:03:21.486505 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/043e5d5e-bf8f-4ce1-9c22-5530da62284c-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd\" (UID: \"043e5d5e-bf8f-4ce1-9c22-5530da62284c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" Apr 16 17:03:21.486608 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.486554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/043e5d5e-bf8f-4ce1-9c22-5530da62284c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd\" (UID: \"043e5d5e-bf8f-4ce1-9c22-5530da62284c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" Apr 16 17:03:21.486970 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.486947 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/043e5d5e-bf8f-4ce1-9c22-5530da62284c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd\" (UID: \"043e5d5e-bf8f-4ce1-9c22-5530da62284c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" Apr 16 17:03:21.487123 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.487102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/043e5d5e-bf8f-4ce1-9c22-5530da62284c-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd\" (UID: \"043e5d5e-bf8f-4ce1-9c22-5530da62284c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" Apr 16 17:03:21.549998 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.549971 2576 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" Apr 16 17:03:21.669666 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.669630 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd"] Apr 16 17:03:21.672392 ip-10-0-137-150 kubenswrapper[2576]: W0416 17:03:21.672350 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod043e5d5e_bf8f_4ce1_9c22_5530da62284c.slice/crio-e0757b8350335c97cc4b9ef59274e52b928fd85b50f737a0839fbb8a59f0ea71 WatchSource:0}: Error finding container e0757b8350335c97cc4b9ef59274e52b928fd85b50f737a0839fbb8a59f0ea71: Status 404 returned error can't find the container with id e0757b8350335c97cc4b9ef59274e52b928fd85b50f737a0839fbb8a59f0ea71 Apr 16 17:03:21.946714 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.946692 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt_f9c8c783-6730-47e2-aaae-bd7548ec194c/storage-initializer/1.log" Apr 16 17:03:21.947041 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.947024 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt_f9c8c783-6730-47e2-aaae-bd7548ec194c/storage-initializer/0.log" Apr 16 17:03:21.947155 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:21.947083 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt" Apr 16 17:03:22.043527 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.043459 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt_f9c8c783-6730-47e2-aaae-bd7548ec194c/storage-initializer/1.log" Apr 16 17:03:22.043832 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.043816 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt_f9c8c783-6730-47e2-aaae-bd7548ec194c/storage-initializer/0.log" Apr 16 17:03:22.043889 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.043850 2576 generic.go:358] "Generic (PLEG): container finished" podID="f9c8c783-6730-47e2-aaae-bd7548ec194c" containerID="b8db9524f0367ca90e94f7aa05e6c57ee35adc1f811287ccf2b020fec6b2d617" exitCode=1 Apr 16 17:03:22.043936 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.043904 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt" event={"ID":"f9c8c783-6730-47e2-aaae-bd7548ec194c","Type":"ContainerDied","Data":"b8db9524f0367ca90e94f7aa05e6c57ee35adc1f811287ccf2b020fec6b2d617"} Apr 16 17:03:22.043936 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.043924 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt" event={"ID":"f9c8c783-6730-47e2-aaae-bd7548ec194c","Type":"ContainerDied","Data":"a28fd0a5bc6a80ff792f533f9b2161bcd0f6a7c77b39ec6e1f7c119d5a79d2cc"} Apr 16 17:03:22.044015 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.043941 2576 scope.go:117] "RemoveContainer" containerID="b8db9524f0367ca90e94f7aa05e6c57ee35adc1f811287ccf2b020fec6b2d617" Apr 16 17:03:22.044015 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.043944 2576 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt" Apr 16 17:03:22.045390 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.045368 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" event={"ID":"043e5d5e-bf8f-4ce1-9c22-5530da62284c","Type":"ContainerStarted","Data":"94490f1d04889d009a7339ffab500959c91f347b05ec1d1a34d41aec76c8f244"} Apr 16 17:03:22.045517 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.045395 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" event={"ID":"043e5d5e-bf8f-4ce1-9c22-5530da62284c","Type":"ContainerStarted","Data":"e0757b8350335c97cc4b9ef59274e52b928fd85b50f737a0839fbb8a59f0ea71"} Apr 16 17:03:22.051633 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.051614 2576 scope.go:117] "RemoveContainer" containerID="fce8f60f466b02f1a507830b930c2d8d13b0d35b03c20a3acd68cab0246dff8f" Apr 16 17:03:22.058797 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.058772 2576 scope.go:117] "RemoveContainer" containerID="b8db9524f0367ca90e94f7aa05e6c57ee35adc1f811287ccf2b020fec6b2d617" Apr 16 17:03:22.059021 ip-10-0-137-150 kubenswrapper[2576]: E0416 17:03:22.059005 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8db9524f0367ca90e94f7aa05e6c57ee35adc1f811287ccf2b020fec6b2d617\": container with ID starting with b8db9524f0367ca90e94f7aa05e6c57ee35adc1f811287ccf2b020fec6b2d617 not found: ID does not exist" containerID="b8db9524f0367ca90e94f7aa05e6c57ee35adc1f811287ccf2b020fec6b2d617" Apr 16 17:03:22.059079 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.059038 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b8db9524f0367ca90e94f7aa05e6c57ee35adc1f811287ccf2b020fec6b2d617"} err="failed to get container status \"b8db9524f0367ca90e94f7aa05e6c57ee35adc1f811287ccf2b020fec6b2d617\": rpc error: code = NotFound desc = could not find container \"b8db9524f0367ca90e94f7aa05e6c57ee35adc1f811287ccf2b020fec6b2d617\": container with ID starting with b8db9524f0367ca90e94f7aa05e6c57ee35adc1f811287ccf2b020fec6b2d617 not found: ID does not exist" Apr 16 17:03:22.059079 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.059055 2576 scope.go:117] "RemoveContainer" containerID="fce8f60f466b02f1a507830b930c2d8d13b0d35b03c20a3acd68cab0246dff8f" Apr 16 17:03:22.059279 ip-10-0-137-150 kubenswrapper[2576]: E0416 17:03:22.059264 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fce8f60f466b02f1a507830b930c2d8d13b0d35b03c20a3acd68cab0246dff8f\": container with ID starting with fce8f60f466b02f1a507830b930c2d8d13b0d35b03c20a3acd68cab0246dff8f not found: ID does not exist" containerID="fce8f60f466b02f1a507830b930c2d8d13b0d35b03c20a3acd68cab0246dff8f" Apr 16 17:03:22.059328 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.059288 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce8f60f466b02f1a507830b930c2d8d13b0d35b03c20a3acd68cab0246dff8f"} err="failed to get container status \"fce8f60f466b02f1a507830b930c2d8d13b0d35b03c20a3acd68cab0246dff8f\": rpc error: code = NotFound desc = could not find container \"fce8f60f466b02f1a507830b930c2d8d13b0d35b03c20a3acd68cab0246dff8f\": container with ID starting with fce8f60f466b02f1a507830b930c2d8d13b0d35b03c20a3acd68cab0246dff8f not found: ID does not exist" Apr 16 17:03:22.091398 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.091370 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/f9c8c783-6730-47e2-aaae-bd7548ec194c-kserve-provision-location\") pod \"f9c8c783-6730-47e2-aaae-bd7548ec194c\" (UID: \"f9c8c783-6730-47e2-aaae-bd7548ec194c\") " Apr 16 17:03:22.091641 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.091618 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9c8c783-6730-47e2-aaae-bd7548ec194c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f9c8c783-6730-47e2-aaae-bd7548ec194c" (UID: "f9c8c783-6730-47e2-aaae-bd7548ec194c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:03:22.192593 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.192558 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f9c8c783-6730-47e2-aaae-bd7548ec194c-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 17:03:22.381079 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.381050 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt"] Apr 16 17:03:22.390594 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:22.390562 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-66b6cdf85d-9zvwt"] Apr 16 17:03:23.050263 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:23.050162 2576 generic.go:358] "Generic (PLEG): container finished" podID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerID="94490f1d04889d009a7339ffab500959c91f347b05ec1d1a34d41aec76c8f244" exitCode=0 Apr 16 17:03:23.050263 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:23.050246 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" 
event={"ID":"043e5d5e-bf8f-4ce1-9c22-5530da62284c","Type":"ContainerDied","Data":"94490f1d04889d009a7339ffab500959c91f347b05ec1d1a34d41aec76c8f244"} Apr 16 17:03:23.278579 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:23.278546 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9c8c783-6730-47e2-aaae-bd7548ec194c" path="/var/lib/kubelet/pods/f9c8c783-6730-47e2-aaae-bd7548ec194c/volumes" Apr 16 17:03:24.057152 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:24.057112 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" event={"ID":"043e5d5e-bf8f-4ce1-9c22-5530da62284c","Type":"ContainerStarted","Data":"ceec701b172d6fb201ca71d5641fe7f96182d0c4ab3f85370065f728d18f6e0b"} Apr 16 17:03:24.057578 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:24.057319 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" Apr 16 17:03:24.058576 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:24.058550 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" podUID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 17:03:24.075200 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:24.075162 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" podStartSLOduration=3.075151805 podStartE2EDuration="3.075151805s" podCreationTimestamp="2026-04-16 17:03:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:03:24.074113432 +0000 UTC m=+3627.429103562" 
watchObservedRunningTime="2026-04-16 17:03:24.075151805 +0000 UTC m=+3627.430141934" Apr 16 17:03:25.060829 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:25.060787 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" podUID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 17:03:35.061745 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:35.061695 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" podUID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 17:03:45.061068 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:45.061025 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" podUID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 17:03:55.061262 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:03:55.061192 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" podUID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 17:04:05.061799 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:05.061759 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" podUID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.133.0.74:8080: connect: connection refused" Apr 16 17:04:15.060983 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:15.060941 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" podUID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 17:04:25.060844 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:25.060804 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" podUID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.74:8080: connect: connection refused" Apr 16 17:04:29.278585 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:29.278554 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" Apr 16 17:04:31.272775 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:31.272743 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd"] Apr 16 17:04:31.273242 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:31.273073 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" podUID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerName="kserve-container" containerID="cri-o://ceec701b172d6fb201ca71d5641fe7f96182d0c4ab3f85370065f728d18f6e0b" gracePeriod=30 Apr 16 17:04:32.335005 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:32.334967 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm"] Apr 16 17:04:32.335428 ip-10-0-137-150 kubenswrapper[2576]: I0416 
17:04:32.335347 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9c8c783-6730-47e2-aaae-bd7548ec194c" containerName="storage-initializer" Apr 16 17:04:32.335428 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:32.335360 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c8c783-6730-47e2-aaae-bd7548ec194c" containerName="storage-initializer" Apr 16 17:04:32.335428 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:32.335421 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9c8c783-6730-47e2-aaae-bd7548ec194c" containerName="storage-initializer" Apr 16 17:04:32.335428 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:32.335430 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f9c8c783-6730-47e2-aaae-bd7548ec194c" containerName="storage-initializer" Apr 16 17:04:32.335591 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:32.335478 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f9c8c783-6730-47e2-aaae-bd7548ec194c" containerName="storage-initializer" Apr 16 17:04:32.335591 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:32.335485 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9c8c783-6730-47e2-aaae-bd7548ec194c" containerName="storage-initializer" Apr 16 17:04:32.338485 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:32.338465 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm" Apr 16 17:04:32.347182 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:32.347159 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm"] Apr 16 17:04:32.515502 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:32.515470 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35526482-b040-4d45-b341-01e160618636-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm\" (UID: \"35526482-b040-4d45-b341-01e160618636\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm" Apr 16 17:04:32.616120 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:32.616033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35526482-b040-4d45-b341-01e160618636-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm\" (UID: \"35526482-b040-4d45-b341-01e160618636\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm" Apr 16 17:04:32.616414 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:32.616395 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35526482-b040-4d45-b341-01e160618636-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm\" (UID: \"35526482-b040-4d45-b341-01e160618636\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm" Apr 16 17:04:32.648199 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:32.648178 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm" Apr 16 17:04:32.794985 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:32.794953 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm"] Apr 16 17:04:32.798506 ip-10-0-137-150 kubenswrapper[2576]: W0416 17:04:32.798478 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35526482_b040_4d45_b341_01e160618636.slice/crio-3cfa6d8ef8c6696cab05b6e940d6940038509de270f8de60e9a8e95a8be6e33c WatchSource:0}: Error finding container 3cfa6d8ef8c6696cab05b6e940d6940038509de270f8de60e9a8e95a8be6e33c: Status 404 returned error can't find the container with id 3cfa6d8ef8c6696cab05b6e940d6940038509de270f8de60e9a8e95a8be6e33c Apr 16 17:04:33.268783 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:33.268752 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm" event={"ID":"35526482-b040-4d45-b341-01e160618636","Type":"ContainerStarted","Data":"dbe54e89058fa15b251b82a77d6c14bdc2dec2a8d0786d6ad3c6147e575a319a"} Apr 16 17:04:33.268783 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:33.268786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm" event={"ID":"35526482-b040-4d45-b341-01e160618636","Type":"ContainerStarted","Data":"3cfa6d8ef8c6696cab05b6e940d6940038509de270f8de60e9a8e95a8be6e33c"} Apr 16 17:04:35.415294 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:35.415275 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" Apr 16 17:04:35.542134 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:35.542045 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/043e5d5e-bf8f-4ce1-9c22-5530da62284c-cabundle-cert\") pod \"043e5d5e-bf8f-4ce1-9c22-5530da62284c\" (UID: \"043e5d5e-bf8f-4ce1-9c22-5530da62284c\") " Apr 16 17:04:35.542134 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:35.542081 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/043e5d5e-bf8f-4ce1-9c22-5530da62284c-kserve-provision-location\") pod \"043e5d5e-bf8f-4ce1-9c22-5530da62284c\" (UID: \"043e5d5e-bf8f-4ce1-9c22-5530da62284c\") " Apr 16 17:04:35.542440 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:35.542416 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/043e5d5e-bf8f-4ce1-9c22-5530da62284c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "043e5d5e-bf8f-4ce1-9c22-5530da62284c" (UID: "043e5d5e-bf8f-4ce1-9c22-5530da62284c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 17:04:35.542512 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:35.542468 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/043e5d5e-bf8f-4ce1-9c22-5530da62284c-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "043e5d5e-bf8f-4ce1-9c22-5530da62284c" (UID: "043e5d5e-bf8f-4ce1-9c22-5530da62284c"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:04:35.643008 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:35.642977 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/043e5d5e-bf8f-4ce1-9c22-5530da62284c-cabundle-cert\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 17:04:35.643008 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:35.643002 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/043e5d5e-bf8f-4ce1-9c22-5530da62284c-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\"" Apr 16 17:04:36.279838 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:36.279818 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm_35526482-b040-4d45-b341-01e160618636/storage-initializer/0.log" Apr 16 17:04:36.279943 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:36.279855 2576 generic.go:358] "Generic (PLEG): container finished" podID="35526482-b040-4d45-b341-01e160618636" containerID="dbe54e89058fa15b251b82a77d6c14bdc2dec2a8d0786d6ad3c6147e575a319a" exitCode=1 Apr 16 17:04:36.279943 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:36.279928 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm" event={"ID":"35526482-b040-4d45-b341-01e160618636","Type":"ContainerDied","Data":"dbe54e89058fa15b251b82a77d6c14bdc2dec2a8d0786d6ad3c6147e575a319a"} Apr 16 17:04:36.281204 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:36.281178 2576 generic.go:358] "Generic (PLEG): container finished" podID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerID="ceec701b172d6fb201ca71d5641fe7f96182d0c4ab3f85370065f728d18f6e0b" exitCode=0 Apr 16 17:04:36.281321 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:36.281228 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" event={"ID":"043e5d5e-bf8f-4ce1-9c22-5530da62284c","Type":"ContainerDied","Data":"ceec701b172d6fb201ca71d5641fe7f96182d0c4ab3f85370065f728d18f6e0b"} Apr 16 17:04:36.281321 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:36.281267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" event={"ID":"043e5d5e-bf8f-4ce1-9c22-5530da62284c","Type":"ContainerDied","Data":"e0757b8350335c97cc4b9ef59274e52b928fd85b50f737a0839fbb8a59f0ea71"} Apr 16 17:04:36.281321 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:36.281288 2576 scope.go:117] "RemoveContainer" containerID="ceec701b172d6fb201ca71d5641fe7f96182d0c4ab3f85370065f728d18f6e0b" Apr 16 17:04:36.281321 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:36.281301 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd" Apr 16 17:04:36.321952 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:36.321933 2576 scope.go:117] "RemoveContainer" containerID="94490f1d04889d009a7339ffab500959c91f347b05ec1d1a34d41aec76c8f244" Apr 16 17:04:36.348177 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:36.348155 2576 scope.go:117] "RemoveContainer" containerID="ceec701b172d6fb201ca71d5641fe7f96182d0c4ab3f85370065f728d18f6e0b" Apr 16 17:04:36.348881 ip-10-0-137-150 kubenswrapper[2576]: E0416 17:04:36.348848 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceec701b172d6fb201ca71d5641fe7f96182d0c4ab3f85370065f728d18f6e0b\": container with ID starting with ceec701b172d6fb201ca71d5641fe7f96182d0c4ab3f85370065f728d18f6e0b not found: ID does not exist" containerID="ceec701b172d6fb201ca71d5641fe7f96182d0c4ab3f85370065f728d18f6e0b" Apr 16 
17:04:36.348971 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:36.348888 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceec701b172d6fb201ca71d5641fe7f96182d0c4ab3f85370065f728d18f6e0b"} err="failed to get container status \"ceec701b172d6fb201ca71d5641fe7f96182d0c4ab3f85370065f728d18f6e0b\": rpc error: code = NotFound desc = could not find container \"ceec701b172d6fb201ca71d5641fe7f96182d0c4ab3f85370065f728d18f6e0b\": container with ID starting with ceec701b172d6fb201ca71d5641fe7f96182d0c4ab3f85370065f728d18f6e0b not found: ID does not exist" Apr 16 17:04:36.348971 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:36.348912 2576 scope.go:117] "RemoveContainer" containerID="94490f1d04889d009a7339ffab500959c91f347b05ec1d1a34d41aec76c8f244" Apr 16 17:04:36.349313 ip-10-0-137-150 kubenswrapper[2576]: E0416 17:04:36.349295 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94490f1d04889d009a7339ffab500959c91f347b05ec1d1a34d41aec76c8f244\": container with ID starting with 94490f1d04889d009a7339ffab500959c91f347b05ec1d1a34d41aec76c8f244 not found: ID does not exist" containerID="94490f1d04889d009a7339ffab500959c91f347b05ec1d1a34d41aec76c8f244" Apr 16 17:04:36.349380 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:36.349317 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94490f1d04889d009a7339ffab500959c91f347b05ec1d1a34d41aec76c8f244"} err="failed to get container status \"94490f1d04889d009a7339ffab500959c91f347b05ec1d1a34d41aec76c8f244\": rpc error: code = NotFound desc = could not find container \"94490f1d04889d009a7339ffab500959c91f347b05ec1d1a34d41aec76c8f244\": container with ID starting with 94490f1d04889d009a7339ffab500959c91f347b05ec1d1a34d41aec76c8f244 not found: ID does not exist" Apr 16 17:04:36.354754 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:36.354734 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd"] Apr 16 17:04:36.358398 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:36.358377 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-6cdc6d7987-7nqtd"] Apr 16 17:04:37.279169 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:37.279137 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" path="/var/lib/kubelet/pods/043e5d5e-bf8f-4ce1-9c22-5530da62284c/volumes" Apr 16 17:04:37.286441 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:37.286418 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm_35526482-b040-4d45-b341-01e160618636/storage-initializer/0.log" Apr 16 17:04:37.286577 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:37.286459 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm" event={"ID":"35526482-b040-4d45-b341-01e160618636","Type":"ContainerStarted","Data":"1404b34e80f07588a734a53c45bf39d4a2441b6650b8c63adcce1c6e11a7c631"} Apr 16 17:04:41.298406 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:41.298381 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm_35526482-b040-4d45-b341-01e160618636/storage-initializer/1.log" Apr 16 17:04:41.298787 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:41.298724 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm_35526482-b040-4d45-b341-01e160618636/storage-initializer/0.log" Apr 16 17:04:41.298787 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:41.298753 2576 generic.go:358] "Generic (PLEG): container 
finished" podID="35526482-b040-4d45-b341-01e160618636" containerID="1404b34e80f07588a734a53c45bf39d4a2441b6650b8c63adcce1c6e11a7c631" exitCode=1 Apr 16 17:04:41.298881 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:41.298812 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm" event={"ID":"35526482-b040-4d45-b341-01e160618636","Type":"ContainerDied","Data":"1404b34e80f07588a734a53c45bf39d4a2441b6650b8c63adcce1c6e11a7c631"} Apr 16 17:04:41.298881 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:41.298840 2576 scope.go:117] "RemoveContainer" containerID="dbe54e89058fa15b251b82a77d6c14bdc2dec2a8d0786d6ad3c6147e575a319a" Apr 16 17:04:41.299168 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:41.299150 2576 scope.go:117] "RemoveContainer" containerID="dbe54e89058fa15b251b82a77d6c14bdc2dec2a8d0786d6ad3c6147e575a319a" Apr 16 17:04:41.308776 ip-10-0-137-150 kubenswrapper[2576]: E0416 17:04:41.308741 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm_kserve-ci-e2e-test_35526482-b040-4d45-b341-01e160618636_0 in pod sandbox 3cfa6d8ef8c6696cab05b6e940d6940038509de270f8de60e9a8e95a8be6e33c from index: no such id: 'dbe54e89058fa15b251b82a77d6c14bdc2dec2a8d0786d6ad3c6147e575a319a'" containerID="dbe54e89058fa15b251b82a77d6c14bdc2dec2a8d0786d6ad3c6147e575a319a" Apr 16 17:04:41.308852 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:41.308783 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbe54e89058fa15b251b82a77d6c14bdc2dec2a8d0786d6ad3c6147e575a319a"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm_kserve-ci-e2e-test_35526482-b040-4d45-b341-01e160618636_0 in pod sandbox 
3cfa6d8ef8c6696cab05b6e940d6940038509de270f8de60e9a8e95a8be6e33c from index: no such id: 'dbe54e89058fa15b251b82a77d6c14bdc2dec2a8d0786d6ad3c6147e575a319a'" Apr 16 17:04:41.308949 ip-10-0-137-150 kubenswrapper[2576]: E0416 17:04:41.308933 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm_kserve-ci-e2e-test(35526482-b040-4d45-b341-01e160618636)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm" podUID="35526482-b040-4d45-b341-01e160618636" Apr 16 17:04:42.303562 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:42.303530 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm_35526482-b040-4d45-b341-01e160618636/storage-initializer/1.log" Apr 16 17:04:42.355715 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:42.355677 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm"] Apr 16 17:04:42.487102 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:42.487077 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm_35526482-b040-4d45-b341-01e160618636/storage-initializer/1.log" Apr 16 17:04:42.487239 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:42.487138 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm"
Apr 16 17:04:42.590150 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:42.590081 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35526482-b040-4d45-b341-01e160618636-kserve-provision-location\") pod \"35526482-b040-4d45-b341-01e160618636\" (UID: \"35526482-b040-4d45-b341-01e160618636\") "
Apr 16 17:04:42.590336 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:42.590314 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35526482-b040-4d45-b341-01e160618636-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "35526482-b040-4d45-b341-01e160618636" (UID: "35526482-b040-4d45-b341-01e160618636"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 17:04:42.691482 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:42.691456 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35526482-b040-4d45-b341-01e160618636-kserve-provision-location\") on node \"ip-10-0-137-150.ec2.internal\" DevicePath \"\""
Apr 16 17:04:43.307733 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:43.307707 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm_35526482-b040-4d45-b341-01e160618636/storage-initializer/1.log"
Apr 16 17:04:43.308132 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:43.307757 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm" event={"ID":"35526482-b040-4d45-b341-01e160618636","Type":"ContainerDied","Data":"3cfa6d8ef8c6696cab05b6e940d6940038509de270f8de60e9a8e95a8be6e33c"}
Apr 16 17:04:43.308132 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:43.307796 2576 scope.go:117] "RemoveContainer" containerID="1404b34e80f07588a734a53c45bf39d4a2441b6650b8c63adcce1c6e11a7c631"
Apr 16 17:04:43.308132 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:43.307805 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm"
Apr 16 17:04:43.337835 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:43.337781 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm"]
Apr 16 17:04:43.341440 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:43.341417 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-66cc478cb4-6zkfm"]
Apr 16 17:04:45.277510 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:04:45.277476 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35526482-b040-4d45-b341-01e160618636" path="/var/lib/kubelet/pods/35526482-b040-4d45-b341-01e160618636/volumes"
Apr 16 17:05:13.439406 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:13.439372 2576 ???:1] "http: TLS handshake error from 10.0.137.150:49646: EOF"
Apr 16 17:05:13.443363 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:13.443331 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-ntvgj_46b0410a-fdd8-490e-b05f-b4633630c446/global-pull-secret-syncer/0.log"
Apr 16 17:05:13.501470 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:13.501444 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-8rbfc_c969e45c-de0a-46a1-b293-93c2eb9bcd6f/konnectivity-agent/0.log"
Apr 16 17:05:13.716151 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:13.716077 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-150.ec2.internal_6ba071faccce2e60097e197ecb90d16a/haproxy/0.log"
Apr 16 17:05:16.924764 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:16.924737 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-kwwwq_16cac812-a0a2-4bf1-aa45-6ec4924939b1/monitoring-plugin/0.log"
Apr 16 17:05:17.137326 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:17.137297 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zdvmv_ee37060b-37d5-4004-91af-f37493123dc3/node-exporter/0.log"
Apr 16 17:05:17.161543 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:17.161520 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zdvmv_ee37060b-37d5-4004-91af-f37493123dc3/kube-rbac-proxy/0.log"
Apr 16 17:05:17.184747 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:17.184692 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zdvmv_ee37060b-37d5-4004-91af-f37493123dc3/init-textfile/0.log"
Apr 16 17:05:17.497786 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:17.497711 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-sfgcn_4973f86a-c513-4ca6-a939-13d71612d2c0/prometheus-operator/0.log"
Apr 16 17:05:17.520056 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:17.520033 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-sfgcn_4973f86a-c513-4ca6-a939-13d71612d2c0/kube-rbac-proxy/0.log"
Apr 16 17:05:19.815332 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:19.815305 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69589b6f45-b29qs_de4905de-aba1-464b-b9be-0bb515ddf375/console/0.log"
Apr 16 17:05:19.851387 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:19.851362 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-knjdk_4270a553-9c4d-44df-8a3d-2ec2e74d38b1/download-server/0.log"
Apr 16 17:05:20.957383 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.957351 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"]
Apr 16 17:05:20.957739 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.957669 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerName="kserve-container"
Apr 16 17:05:20.957739 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.957680 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerName="kserve-container"
Apr 16 17:05:20.957739 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.957698 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35526482-b040-4d45-b341-01e160618636" containerName="storage-initializer"
Apr 16 17:05:20.957739 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.957704 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="35526482-b040-4d45-b341-01e160618636" containerName="storage-initializer"
Apr 16 17:05:20.957739 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.957711 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerName="storage-initializer"
Apr 16 17:05:20.957739 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.957716 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerName="storage-initializer"
Apr 16 17:05:20.957739 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.957722 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35526482-b040-4d45-b341-01e160618636" containerName="storage-initializer"
Apr 16 17:05:20.957739 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.957729 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="35526482-b040-4d45-b341-01e160618636" containerName="storage-initializer"
Apr 16 17:05:20.958007 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.957775 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="043e5d5e-bf8f-4ce1-9c22-5530da62284c" containerName="kserve-container"
Apr 16 17:05:20.958007 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.957783 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="35526482-b040-4d45-b341-01e160618636" containerName="storage-initializer"
Apr 16 17:05:20.960831 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.960811 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:20.962996 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.962977 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7k2bn\"/\"openshift-service-ca.crt\""
Apr 16 17:05:20.962996 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.962987 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7k2bn\"/\"default-dockercfg-nkv5v\""
Apr 16 17:05:20.963893 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.963879 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7k2bn\"/\"kube-root-ca.crt\""
Apr 16 17:05:20.969096 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.969074 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"]
Apr 16 17:05:20.995689 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:20.995666 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vzpf5_01216772-46ae-4344-a250-d689b2fe3c4c/dns/0.log"
Apr 16 17:05:21.016343 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.016318 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-vzpf5_01216772-46ae-4344-a250-d689b2fe3c4c/kube-rbac-proxy/0.log"
Apr 16 17:05:21.038016 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.037993 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6bkvd_e5e28615-1240-4149-a23f-752b612f8a06/dns-node-resolver/0.log"
Apr 16 17:05:21.069063 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.069042 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c8ac159b-feb9-412d-a595-c7fe9641c3fb-lib-modules\") pod \"perf-node-gather-daemonset-c5r2h\" (UID: \"c8ac159b-feb9-412d-a595-c7fe9641c3fb\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.069147 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.069080 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c28g\" (UniqueName: \"kubernetes.io/projected/c8ac159b-feb9-412d-a595-c7fe9641c3fb-kube-api-access-5c28g\") pod \"perf-node-gather-daemonset-c5r2h\" (UID: \"c8ac159b-feb9-412d-a595-c7fe9641c3fb\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.069147 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.069105 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c8ac159b-feb9-412d-a595-c7fe9641c3fb-proc\") pod \"perf-node-gather-daemonset-c5r2h\" (UID: \"c8ac159b-feb9-412d-a595-c7fe9641c3fb\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.069249 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.069148 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c8ac159b-feb9-412d-a595-c7fe9641c3fb-podres\") pod \"perf-node-gather-daemonset-c5r2h\" (UID: \"c8ac159b-feb9-412d-a595-c7fe9641c3fb\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.069249 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.069168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c8ac159b-feb9-412d-a595-c7fe9641c3fb-sys\") pod \"perf-node-gather-daemonset-c5r2h\" (UID: \"c8ac159b-feb9-412d-a595-c7fe9641c3fb\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.169964 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.169930 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5c28g\" (UniqueName: \"kubernetes.io/projected/c8ac159b-feb9-412d-a595-c7fe9641c3fb-kube-api-access-5c28g\") pod \"perf-node-gather-daemonset-c5r2h\" (UID: \"c8ac159b-feb9-412d-a595-c7fe9641c3fb\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.170081 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.169975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c8ac159b-feb9-412d-a595-c7fe9641c3fb-proc\") pod \"perf-node-gather-daemonset-c5r2h\" (UID: \"c8ac159b-feb9-412d-a595-c7fe9641c3fb\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.170081 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.170009 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c8ac159b-feb9-412d-a595-c7fe9641c3fb-podres\") pod \"perf-node-gather-daemonset-c5r2h\" (UID: \"c8ac159b-feb9-412d-a595-c7fe9641c3fb\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.170081 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.170040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c8ac159b-feb9-412d-a595-c7fe9641c3fb-sys\") pod \"perf-node-gather-daemonset-c5r2h\" (UID: \"c8ac159b-feb9-412d-a595-c7fe9641c3fb\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.170271 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.170106 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c8ac159b-feb9-412d-a595-c7fe9641c3fb-lib-modules\") pod \"perf-node-gather-daemonset-c5r2h\" (UID: \"c8ac159b-feb9-412d-a595-c7fe9641c3fb\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.170271 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.170114 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c8ac159b-feb9-412d-a595-c7fe9641c3fb-proc\") pod \"perf-node-gather-daemonset-c5r2h\" (UID: \"c8ac159b-feb9-412d-a595-c7fe9641c3fb\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.170271 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.170181 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c8ac159b-feb9-412d-a595-c7fe9641c3fb-podres\") pod \"perf-node-gather-daemonset-c5r2h\" (UID: \"c8ac159b-feb9-412d-a595-c7fe9641c3fb\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.170271 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.170229 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c8ac159b-feb9-412d-a595-c7fe9641c3fb-sys\") pod \"perf-node-gather-daemonset-c5r2h\" (UID: \"c8ac159b-feb9-412d-a595-c7fe9641c3fb\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.170271 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.170252 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c8ac159b-feb9-412d-a595-c7fe9641c3fb-lib-modules\") pod \"perf-node-gather-daemonset-c5r2h\" (UID: \"c8ac159b-feb9-412d-a595-c7fe9641c3fb\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.176859 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.176840 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c28g\" (UniqueName: \"kubernetes.io/projected/c8ac159b-feb9-412d-a595-c7fe9641c3fb-kube-api-access-5c28g\") pod \"perf-node-gather-daemonset-c5r2h\" (UID: \"c8ac159b-feb9-412d-a595-c7fe9641c3fb\") " pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.270714 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.270617 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:21.386846 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.386815 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"]
Apr 16 17:05:21.389762 ip-10-0-137-150 kubenswrapper[2576]: W0416 17:05:21.389718 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc8ac159b_feb9_412d_a595_c7fe9641c3fb.slice/crio-a384332d51828f88b34b2ad0d8a3bbb63f9aee80b52b4a1b44fd95179cb01df4 WatchSource:0}: Error finding container a384332d51828f88b34b2ad0d8a3bbb63f9aee80b52b4a1b44fd95179cb01df4: Status 404 returned error can't find the container with id a384332d51828f88b34b2ad0d8a3bbb63f9aee80b52b4a1b44fd95179cb01df4
Apr 16 17:05:21.391347 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.391329 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:05:21.439343 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.439323 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h" event={"ID":"c8ac159b-feb9-412d-a595-c7fe9641c3fb","Type":"ContainerStarted","Data":"a384332d51828f88b34b2ad0d8a3bbb63f9aee80b52b4a1b44fd95179cb01df4"}
Apr 16 17:05:21.505513 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.505480 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-545c64844d-f4pv2_fe7bede0-0e38-4e70-89e7-de62eb29aaa4/registry/0.log"
Apr 16 17:05:21.575298 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:21.575273 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cgg4d_4b05a2af-d8f2-42c1-a086-851d57791b5f/node-ca/0.log"
Apr 16 17:05:22.443738 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:22.443699 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h" event={"ID":"c8ac159b-feb9-412d-a595-c7fe9641c3fb","Type":"ContainerStarted","Data":"6c991cf8c653a1fdd79949338153564b974dc26e9aa9cd19f99a86878cbfb3c6"}
Apr 16 17:05:22.444130 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:22.443840 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:22.459846 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:22.459805 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h" podStartSLOduration=2.459792797 podStartE2EDuration="2.459792797s" podCreationTimestamp="2026-04-16 17:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:05:22.458574836 +0000 UTC m=+3745.813564965" watchObservedRunningTime="2026-04-16 17:05:22.459792797 +0000 UTC m=+3745.814782926"
Apr 16 17:05:22.606860 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:22.606837 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qg8tx_48e2d2c2-3b35-49d1-bb3f-46840c5001a5/serve-healthcheck-canary/0.log"
Apr 16 17:05:22.963400 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:22.963375 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5tv2m_48910414-96d3-4257-899f-e58464821442/kube-rbac-proxy/0.log"
Apr 16 17:05:22.983445 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:22.983422 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5tv2m_48910414-96d3-4257-899f-e58464821442/exporter/0.log"
Apr 16 17:05:23.002631 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:23.002605 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-5tv2m_48910414-96d3-4257-899f-e58464821442/extractor/0.log"
Apr 16 17:05:25.119951 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:25.119898 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-7c68cb4fc8-gwb89_68ccd199-6bbd-41b6-b01a-ca60d98f86e9/manager/0.log"
Apr 16 17:05:25.163481 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:25.163457 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-mw9hq_a3adf15a-d6a5-42dc-b9f0-f81e45ce0206/server/0.log"
Apr 16 17:05:25.527009 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:25.526977 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-n55dz_07eed151-94e9-46aa-a532-2e2460f02e48/seaweedfs-tls-serving/0.log"
Apr 16 17:05:28.455941 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:28.455909 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7k2bn/perf-node-gather-daemonset-c5r2h"
Apr 16 17:05:29.096546 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:29.096524 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-knx4c_032c92aa-e753-46b9-9966-8c772b88005b/migrator/0.log"
Apr 16 17:05:29.117000 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:29.116981 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-knx4c_032c92aa-e753-46b9-9966-8c772b88005b/graceful-termination/0.log"
Apr 16 17:05:30.873880 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:30.873853 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rvbp5_345463d3-76fd-4233-8808-6df63a64c4b5/kube-multus-additional-cni-plugins/0.log"
Apr 16 17:05:30.897613 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:30.897590 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rvbp5_345463d3-76fd-4233-8808-6df63a64c4b5/egress-router-binary-copy/0.log"
Apr 16 17:05:30.923048 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:30.923023 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rvbp5_345463d3-76fd-4233-8808-6df63a64c4b5/cni-plugins/0.log"
Apr 16 17:05:30.942846 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:30.942825 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rvbp5_345463d3-76fd-4233-8808-6df63a64c4b5/bond-cni-plugin/0.log"
Apr 16 17:05:30.965816 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:30.965800 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rvbp5_345463d3-76fd-4233-8808-6df63a64c4b5/routeoverride-cni/0.log"
Apr 16 17:05:30.988076 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:30.987985 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rvbp5_345463d3-76fd-4233-8808-6df63a64c4b5/whereabouts-cni-bincopy/0.log"
Apr 16 17:05:31.007776 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:31.007754 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rvbp5_345463d3-76fd-4233-8808-6df63a64c4b5/whereabouts-cni/0.log"
Apr 16 17:05:31.047455 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:31.047424 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mjlcb_4a32280f-6aec-4142-bb8f-1547ac4378ab/kube-multus/0.log"
Apr 16 17:05:31.173034 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:31.172998 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gvndv_916e5e50-1aef-4277-971a-7f2e8ffd2703/network-metrics-daemon/0.log"
Apr 16 17:05:31.189164 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:31.189141 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gvndv_916e5e50-1aef-4277-971a-7f2e8ffd2703/kube-rbac-proxy/0.log"
Apr 16 17:05:31.914865 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:31.914842 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fl2m_d0578a47-b539-4acd-9e68-0468bd267183/ovn-controller/0.log"
Apr 16 17:05:31.949933 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:31.949909 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fl2m_d0578a47-b539-4acd-9e68-0468bd267183/ovn-acl-logging/0.log"
Apr 16 17:05:31.970802 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:31.970779 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fl2m_d0578a47-b539-4acd-9e68-0468bd267183/kube-rbac-proxy-node/0.log"
Apr 16 17:05:31.991439 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:31.991418 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fl2m_d0578a47-b539-4acd-9e68-0468bd267183/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 17:05:32.011377 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:32.011361 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fl2m_d0578a47-b539-4acd-9e68-0468bd267183/northd/0.log"
Apr 16 17:05:32.032133 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:32.032107 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fl2m_d0578a47-b539-4acd-9e68-0468bd267183/nbdb/0.log"
Apr 16 17:05:32.059161 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:32.059094 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fl2m_d0578a47-b539-4acd-9e68-0468bd267183/sbdb/0.log"
Apr 16 17:05:32.166684 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:32.166647 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5fl2m_d0578a47-b539-4acd-9e68-0468bd267183/ovnkube-controller/0.log"
Apr 16 17:05:33.722379 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:33.722351 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-rqrcp_4dc825b4-ec3d-4c27-aa56-053ec3f50964/check-endpoints/0.log"
Apr 16 17:05:33.788860 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:33.788832 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-xjfdj_31ee7aaa-d858-49a4-becd-246ec9f1a8c5/network-check-target-container/0.log"
Apr 16 17:05:34.735635 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:34.735605 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-rvgfg_e8c9cbbd-500b-410c-bc31-680efc2b8a0c/iptables-alerter/0.log"
Apr 16 17:05:35.377696 ip-10-0-137-150 kubenswrapper[2576]: I0416 17:05:35.377660 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-tw4j2_f2c0205d-61f8-4c10-bd7f-f8fcf4336fba/tuned/0.log"