Apr 20 20:02:44.870075 ip-10-0-143-23 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 20:02:44.870088 ip-10-0-143-23 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 20:02:44.870098 ip-10-0-143-23 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 20:02:44.870331 ip-10-0-143-23 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 20:02:54.877180 ip-10-0-143-23 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 20:02:54.877201 ip-10-0-143-23 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 6dc424ed0f3940f8a4f68f4cc761bbf7 --
Apr 20 20:05:25.148387 ip-10-0-143-23 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 20:05:25.659327 ip-10-0-143-23 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:25.659327 ip-10-0-143-23 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 20:05:25.659327 ip-10-0-143-23 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:25.659327 ip-10-0-143-23 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 20:05:25.659327 ip-10-0-143-23 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:05:25.659961 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.659414 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 20:05:25.662749 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662734 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:05:25.662749 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662749 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662753 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662756 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662759 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662762 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662764 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662768 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662771 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662774 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662777 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662780 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662783 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662791 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662794 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662797 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662800 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662802 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662804 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662807 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662810 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:05:25.662808 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662813 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662816 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662819 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662822 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662824 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662827 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662830 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662832 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662834 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662837 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662839 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662842 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662844 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662846 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662849 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662852 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662854 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662856 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662858 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662861 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:05:25.663327 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662863 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662866 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662868 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662870 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662873 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662876 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662878 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662880 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662883 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662885 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662887 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662890 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662892 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662896 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662899 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662901 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662904 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662906 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662909 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662911 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:05:25.664127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662914 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662917 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662921 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662925 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662928 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662930 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662933 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662935 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662937 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662940 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662942 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662945 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662947 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662950 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662952 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662955 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662957 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662959 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662964 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:05:25.664830 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662968 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:05:25.665303 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662971 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:05:25.665303 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662974 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:05:25.665303 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662977 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:05:25.665303 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662979 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:05:25.665303 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.662982 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
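The flood of feature_gate.go:328 warnings above is the kubelet being handed OpenShift-only gate names (ManagedBootImages, PinnedImages, GatewayAPIController, and so on) that its own gate registry does not contain. Note that it only warns and keeps starting up, while registered upstream gates such as KMSv1 and ServiceAccountTokenNodeBinding are accepted, with deprecation and GA notices respectively. A minimal sketch of that warn-and-continue pattern, where the registry (known) and the parse helper (apply) are hypothetical stand-ins for the real featuregate machinery:

    package main

    import (
        "fmt"
        "strconv"
        "strings"
    )

    // known is a toy stand-in for the component's gate registry
    // (name -> default value). The real kubelet registry is far larger.
    var known = map[string]bool{
        "KMSv1":    true,
        "NodeSwap": false,
    }

    // apply parses a "Name=bool,Name=bool" spec the way --feature-gates
    // does, warning on unregistered names instead of aborting startup.
    func apply(spec string) map[string]bool {
        resolved := map[string]bool{}
        for _, kv := range strings.Split(spec, ",") {
            name, val, ok := strings.Cut(kv, "=")
            if !ok {
                continue
            }
            if _, registered := known[name]; !registered {
                fmt.Printf("unrecognized feature gate: %s\n", name)
                continue
            }
            b, err := strconv.ParseBool(val)
            if err != nil {
                continue
            }
            resolved[name] = b
        }
        return resolved
    }

    func main() {
        // GatewayAPI draws a warning; KMSv1 is resolved and kept.
        fmt.Println(apply("KMSv1=true,GatewayAPI=true"))
    }

Run against an input containing an unknown name, the sketch prints the same "unrecognized feature gate" line and still returns the gates it could resolve, which matches the behavior visible in this journal.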
Apr 20 20:05:25.667271 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665835 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 20:05:25.667271 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665849 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 20:05:25.667271 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665856 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 20:05:25.667271 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665860 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 20:05:25.667271 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665865 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 20:05:25.667271 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665868 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 20:05:25.667271 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665873 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 20:05:25.667271 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665878 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 20:05:25.667271 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665881 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665885 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665888 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665891 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665894 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665897 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665900 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665903 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665906 2576 flags.go:64] FLAG: --cloud-config=""
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665908 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665913 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665918 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665921 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665924 2576 flags.go:64] FLAG: --config-dir=""
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665927 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665931 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665945 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665949 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665952 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665956 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665959 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665962 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665965 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665968 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665971 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 20:05:25.667773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665975 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665978 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665981 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665984 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665987 2576 flags.go:64] FLAG: --enable-server="true"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665990 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665995 2576 flags.go:64] FLAG: --event-burst="100"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.665998 2576 flags.go:64] FLAG: --event-qps="50"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666001 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666005 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666008 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666012 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666015 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666018 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666021 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666024 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666027 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666030 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666033 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666035 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666038 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666041 2576 flags.go:64] FLAG: --feature-gates=""
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666045 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666048 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666051 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 20:05:25.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666055 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666059 2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666062 2576 flags.go:64] FLAG: --help="false"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666065 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-143-23.ec2.internal"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666067 2576 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666071 2576 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666074 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666077 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666080 2576 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666083 2576 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666086 2576 flags.go:64] FLAG: --image-service-endpoint=""
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666088 2576 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666091 2576 flags.go:64] FLAG: --kube-api-burst="100"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666094 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666097 2576 flags.go:64] FLAG: --kube-api-qps="50"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666100 2576 flags.go:64] FLAG: --kube-reserved=""
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666103 2576 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666106 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666125 2576 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666130 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666134 2576 flags.go:64] FLAG: --lock-file=""
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666136 2576 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666139 2576 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666142 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 20 20:05:25.668976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666147 2576 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666150 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666154 2576 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666157 2576 flags.go:64] FLAG: --logging-format="text"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666159 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666163 2576 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666166 2576 flags.go:64] FLAG: --manifest-url=""
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666169 2576 flags.go:64] FLAG: --manifest-url-header=""
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666173 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666177 2576 flags.go:64] FLAG: --max-open-files="1000000"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666185 2576 flags.go:64] FLAG: --max-pods="110"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666188 2576 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666191 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666194 2576 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666197 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666200 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666203 2576 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666206 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666214 2576 flags.go:64] FLAG: --node-status-max-images="50"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666217 2576 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666220 2576 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666223 2576 flags.go:64] FLAG: --pod-cidr=""
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666226 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 20 20:05:25.669560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666231 2576 flags.go:64] FLAG: --pod-manifest-path=""
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666234 2576 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666237 2576 flags.go:64] FLAG: --pods-per-core="0"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666240 2576 flags.go:64] FLAG: --port="10250"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666243 2576 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666246 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ce087473093551c4"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666249 2576 flags.go:64] FLAG: --qos-reserved=""
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666252 2576 flags.go:64] FLAG: --read-only-port="10255"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666255 2576 flags.go:64] FLAG: --register-node="true"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666257 2576 flags.go:64] FLAG: --register-schedulable="true"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666260 2576 flags.go:64] FLAG: --register-with-taints=""
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666264 2576 flags.go:64] FLAG: --registry-burst="10"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666267 2576 flags.go:64] FLAG: --registry-qps="5"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666270 2576 flags.go:64] FLAG: --reserved-cpus=""
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666273 2576 flags.go:64] FLAG: --reserved-memory=""
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666277 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666280 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666283 2576 flags.go:64] FLAG: --rotate-certificates="false"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666285 2576 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666289 2576 flags.go:64] FLAG: --runonce="false"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666292 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666295 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666299 2576 flags.go:64] FLAG: --seccomp-default="false"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666302 2576 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666305 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666308 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 20 20:05:25.670134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666311 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666314 2576 flags.go:64] FLAG: --storage-driver-password="root"
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666317 2576 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666320 2576 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666323 2576 flags.go:64] FLAG: --storage-driver-user="root"
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666326 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666329 2576 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666332 2576 flags.go:64] FLAG: --system-cgroups=""
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666335 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666341 2576 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666343 2576 flags.go:64] FLAG: --tls-cert-file=""
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666346 2576 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666350 2576 flags.go:64] FLAG: --tls-min-version=""
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666353 2576 flags.go:64] FLAG: --tls-private-key-file=""
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666356 2576 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666359 2576 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666362 2576 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666365 2576 flags.go:64] FLAG: --v="2"
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666369 2576 flags.go:64] FLAG: --version="false"
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666373 2576 flags.go:64] FLAG: --vmodule=""
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666377 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 20 20:05:25.670738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666380 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
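The flags.go:64 block above is the kubelet echoing every registered command-line flag together with its effective value, one line per flag, which makes the full startup configuration auditable from the journal alone. A minimal sketch of the same echo pattern using Go's standard flag package (the kubelet itself uses pflag; the two flags registered here are stand-ins):

    package main

    import (
        "flag"
        "log"
    )

    func main() {
        // Two stand-in flags; the real kubelet registers well over a hundred.
        flag.String("node-ip", "0.0.0.0", "node IP")
        flag.Int("max-pods", 110, "maximum pods per node")
        flag.Parse()

        // Echo every registered flag with its effective value, the same
        // "FLAG: --name=value" shape seen in the journal above.
        flag.VisitAll(func(f *flag.Flag) {
            log.Printf("FLAG: --%s=%q", f.Name, f.Value)
        })
    }

flag.VisitAll walks the registered flags in lexicographical order, which is also why the kubelet's dump comes out alphabetized.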
feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.666708 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.666710 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.666713 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.666715 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.666721 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.672907 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.672922 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.672968 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.672972 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.672976 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.672979 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.672981 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.672984 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.672987 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.672989 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:05:25.673302 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.672992 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.672994 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.672997 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.672999 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673002 2576 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerificationPKI Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673005 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673008 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673010 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673013 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673015 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673017 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673020 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673023 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673025 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673030 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673034 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673038 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673040 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673043 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:05:25.673693 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673046 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673048 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673051 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673053 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673057 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673060 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673062 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673065 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements 
Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673067 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673069 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673072 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673075 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673078 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673080 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673083 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673085 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673088 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673090 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673093 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673097 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 20:05:25.674173 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673099 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673102 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673104 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673122 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673127 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673132 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673134 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673137 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673139 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673142 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 20:05:25.674658 ip-10-0-143-23 
kubenswrapper[2576]: W0420 20:05:25.673145 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673147 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673150 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673153 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673156 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673158 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673162 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673164 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673167 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 20:05:25.674658 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673170 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673173 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673177 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673179 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673182 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673184 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673187 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673189 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673192 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673195 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673197 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673199 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673202 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673205 2576 feature_gate.go:328] 
unrecognized feature gate: UpgradeStatus Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673208 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673211 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673213 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673216 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673218 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 20:05:25.675127 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673221 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:05:25.675637 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.673226 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 20:05:25.675637 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673315 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:05:25.675637 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673320 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 20:05:25.675637 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673323 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 20:05:25.675637 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673326 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 20:05:25.675637 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673329 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:05:25.675637 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673332 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:05:25.675637 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673335 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 20:05:25.675637 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673338 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 20:05:25.675637 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673340 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:05:25.675637 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673343 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:05:25.675637 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673346 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:05:25.675637 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673350 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 20:05:25.675637 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673354 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 20:05:25.675637 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673357 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673360 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673363 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673365 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673368 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673370 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673373 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673375 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673378 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673380 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673383 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673385 2576 feature_gate.go:328] unrecognized 
feature gate: RouteAdvertisements Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673389 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673392 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673394 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673397 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673399 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673402 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673405 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673407 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 20:05:25.676004 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673410 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673412 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673415 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673417 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673420 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673422 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673425 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673427 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673430 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673434 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
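The feature_gate.go:328 flood above is the embedded Kubernetes feature-gate parser being handed the full cluster gate list, most of whose names are OpenShift-level gates it does not know: unrecognized keys are logged at warning level and skipped rather than failing startup, recognized GA or deprecated gates (ServiceAccountTokenNodeBinding, KMSv1) are set with an extra notice, and the effective map is then printed at feature_gate.go:384. A minimal Go sketch of that tolerant-parsing pattern — illustrative names and defaults, not the actual kubelet implementation:

    package main

    import "log"

    // known maps recognized gate names to defaults; anything else is
    // warned about and ignored, mirroring the feature_gate.go:328 lines.
    var known = map[string]bool{
        "ImageVolume":                    true,
        "ServiceAccountTokenNodeBinding": true, // GA gate: setting it still warns
    }

    func setGates(requested map[string]bool) map[string]bool {
        effective := map[string]bool{}
        for name, def := range known {
            effective[name] = def
        }
        for name, val := range requested {
            if _, ok := known[name]; !ok {
                log.Printf("W: unrecognized feature gate: %s", name)
                continue // tolerated, not fatal
            }
            effective[name] = val
        }
        return effective
    }

    func main() {
        // "RouteAdvertisements" stands in for any OpenShift-only gate name.
        gates := setGates(map[string]bool{
            "RouteAdvertisements": true, // -> warning, dropped
            "ImageVolume":         true,
        })
        log.Printf("I: feature gates: %v", gates)
    }

Tolerating unknown names is what lets one cluster-wide gate list be fed to binaries that each compile in only their own registry, and it is why the identical warning block repeats once per parse in the log.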
Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673437 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673440 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673442 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673445 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673447 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673450 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673453 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673455 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673458 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:05:25.676479 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673460 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673463 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673465 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673467 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673470 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673472 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673475 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673478 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673480 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673482 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673485 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673487 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673490 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673492 2576 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673495 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673497 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673499 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673502 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673504 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673507 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:05:25.676919 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673509 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 20:05:25.677416 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673512 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 20:05:25.677416 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673514 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 20:05:25.677416 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673517 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:05:25.677416 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673519 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 20:05:25.677416 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673521 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 20:05:25.677416 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673524 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 20:05:25.677416 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673527 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:05:25.677416 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673529 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:05:25.677416 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673532 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:05:25.677416 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673534 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 20:05:25.677416 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673537 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:05:25.677416 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673539 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 20:05:25.677416 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:25.673542 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:05:25.677416 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.673547 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true 
RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 20:05:25.677416 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.673644 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 20:05:25.677786 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.677679 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 20:05:25.678570 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.678558 2576 server.go:1019] "Starting client certificate rotation" Apr 20 20:05:25.678677 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.678660 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 20:05:25.678725 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.678715 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 20:05:25.710388 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.710366 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 20:05:25.715627 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.715610 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 20:05:25.729713 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.729691 2576 log.go:25] "Validated CRI v1 runtime API" Apr 20 20:05:25.735991 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.735977 2576 log.go:25] "Validated CRI v1 image API" Apr 20 20:05:25.737157 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.737141 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 20:05:25.741280 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.741261 2576 fs.go:135] Filesystem UUIDs: map[235c8eb6-456e-4036-9f73-6e69d74e3db6:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 9ed6dad0-0538-466d-870f-49db56b66c37:/dev/nvme0n1p3] Apr 20 20:05:25.741332 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.741281 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 20:05:25.743837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.743820 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 20:05:25.747122 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.747011 2576 manager.go:217] Machine: {Timestamp:2026-04-20 20:05:25.744730168 +0000 UTC m=+0.457412155 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3198546 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] 
MachineID:ec2239df62e5973969e0dedc31c82839 SystemUUID:ec2239df-62e5-9739-69e0-dedc31c82839 BootID:6dc424ed-0f39-40f8-a4f6-8f4cc761bbf7 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:4e:b5:7e:9b:55 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:4e:b5:7e:9b:55 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ee:99:9a:b9:73:1a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 20:05:25.747122 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.747102 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
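The fs.go:135-136 and manager.go:217 entries above are cAdvisor's boot-time inventory: filesystem partitions with capacity and inode counts, then the full Machine record (cores, caches, NUMA topology, NICs). The capacity and inode figures are statfs-level data; a minimal Linux-only Go sketch that reports the same numbers for the mountpoints in the partition map (paths hardcoded for illustration):

    package main

    import (
        "fmt"
        "syscall"
    )

    func main() {
        // Mountpoints taken from the fs.go:136 partition map above.
        for _, mp := range []string{"/var", "/boot", "/run", "/tmp"} {
            var st syscall.Statfs_t
            if err := syscall.Statfs(mp, &st); err != nil {
                fmt.Printf("%s: %v\n", mp, err)
                continue
            }
            capacity := uint64(st.Bsize) * st.Blocks // bytes, cf. the Capacity: fields
            fmt.Printf("%s: capacity=%d inodes=%d\n", mp, capacity, st.Files)
        }
    }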
Apr 20 20:05:25.747249 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.747188 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 20 20:05:25.748346 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.748320 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 20 20:05:25.748497 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.748350 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-23.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 20 20:05:25.748568 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.748510 2576 topology_manager.go:138] "Creating topology manager with none policy" Apr 20 20:05:25.748568 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.748523 2576 container_manager_linux.go:306] "Creating device plugin manager" Apr 20 20:05:25.748568 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.748541 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:05:25.749322 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.749310 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 20 20:05:25.750101 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.750090 2576 state_mem.go:36] "Initialized new in-memory state store" Apr 20 20:05:25.750240 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.750229 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 20 20:05:25.753071 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.753060 2576 kubelet.go:491] "Attempting to sync node with API server" Apr 20 20:05:25.753141 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.753082 2576 kubelet.go:386] "Adding static pod path" 
path="/etc/kubernetes/manifests" Apr 20 20:05:25.753141 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.753099 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 20 20:05:25.753141 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.753125 2576 kubelet.go:397] "Adding apiserver pod source" Apr 20 20:05:25.753141 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.753138 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 20 20:05:25.754239 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.754226 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:05:25.754307 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.754248 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 20 20:05:25.757645 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.757629 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 20 20:05:25.758904 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.758889 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 20 20:05:25.760651 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.760638 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 20 20:05:25.760683 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.760661 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 20 20:05:25.760683 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.760667 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 20 20:05:25.760683 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.760674 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 20 20:05:25.760683 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.760681 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 20 20:05:25.760792 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.760687 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 20 20:05:25.760792 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.760692 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 20 20:05:25.760792 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.760698 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 20 20:05:25.760792 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.760704 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 20 20:05:25.760792 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.760710 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 20 20:05:25.760792 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.760720 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 20 20:05:25.760792 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.760732 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 20 20:05:25.761610 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.761598 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 20 20:05:25.761647 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.761618 2576 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 20 20:05:25.765899 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.765886 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 20 20:05:25.765976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.765942 2576 server.go:1295] "Started kubelet" Apr 20 20:05:25.766041 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.766012 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 20 20:05:25.766152 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.766091 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 20 20:05:25.766192 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.766171 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 20 20:05:25.766662 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:25.766637 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 20 20:05:25.766744 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:25.766646 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-23.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 20 20:05:25.766789 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.766742 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-23.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 20:05:25.767002 ip-10-0-143-23 systemd[1]: Started Kubernetes Kubelet. 
Apr 20 20:05:25.767287 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.767247 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 20:05:25.772973 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.772952 2576 server.go:317] "Adding debug handlers to kubelet server" Apr 20 20:05:25.776824 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.776808 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 20:05:25.776824 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.776817 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 20:05:25.776964 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:25.774763 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-23.ec2.internal.18a8295b7432b6b9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-23.ec2.internal,UID:ip-10-0-143-23.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-23.ec2.internal,},FirstTimestamp:2026-04-20 20:05:25.765904057 +0000 UTC m=+0.478586043,LastTimestamp:2026-04-20 20:05:25.765904057 +0000 UTC m=+0.478586043,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-23.ec2.internal,}" Apr 20 20:05:25.777436 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.777418 2576 factory.go:55] Registering systemd factory Apr 20 20:05:25.777524 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.777442 2576 factory.go:223] Registration of the systemd container factory successfully Apr 20 20:05:25.777524 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.777509 2576 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 20:05:25.777524 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.777507 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 20:05:25.777625 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.777525 2576 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 20:05:25.777662 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.777638 2576 factory.go:153] Registering CRI-O factory Apr 20 20:05:25.777662 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.777646 2576 reconstruct.go:97] "Volume reconstruction finished" Apr 20 20:05:25.777662 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.777651 2576 factory.go:223] Registration of the crio container factory successfully Apr 20 20:05:25.777662 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.777655 2576 reconciler.go:26] "Reconciler: start to sync state" Apr 20 20:05:25.777826 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.777695 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 20 20:05:25.777826 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.777720 2576 factory.go:103] Registering Raw factory Apr 20 20:05:25.777826 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.777742 2576 manager.go:1196] Started watching 
for new ooms in manager
Apr 20 20:05:25.777826 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:25.777805 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-23.ec2.internal\" not found"
Apr 20 20:05:25.778354 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.778340 2576 manager.go:319] Starting recovery of all containers
Apr 20 20:05:25.780156 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:25.780133 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 20:05:25.784152 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:25.784127 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 20:05:25.784261 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:25.784177 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-23.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 20:05:25.789426 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.789314 2576 manager.go:324] Recovery completed
Apr 20 20:05:25.792007 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.791991 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7gcrr"
Apr 20 20:05:25.793382 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.793371 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:05:25.795625 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.795612 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:05:25.795692 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.795637 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:05:25.795692 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.795646 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:05:25.796125 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.796096 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 20:05:25.796125 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.796121 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 20:05:25.796228 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.796139 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 20:05:25.797397 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:25.797331 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-23.ec2.internal.18a8295b75f8368d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-23.ec2.internal,UID:ip-10-0-143-23.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-23.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-23.ec2.internal,},FirstTimestamp:2026-04-20 20:05:25.795624589 +0000 UTC m=+0.508306576,LastTimestamp:2026-04-20 20:05:25.795624589 +0000 UTC m=+0.508306576,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-23.ec2.internal,}"
Apr 20 20:05:25.799068 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.799057 2576 policy_none.go:49] "None policy: Start"
Apr 20 20:05:25.799105 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.799072 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 20:05:25.799105 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.799082 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 20:05:25.800438 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.800425 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7gcrr"
Apr 20 20:05:25.835652 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.835640 2576 manager.go:341] "Starting Device Plugin manager"
Apr 20 20:05:25.864053 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:25.835667 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 20:05:25.864053 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.835676 2576 server.go:85] "Starting device plugin registration server"
Apr 20 20:05:25.864053 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.836068 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 20:05:25.864053 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.836084 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 20:05:25.864053 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.836319 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 20:05:25.864053 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.836409 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 20:05:25.864053 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.836421 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 20:05:25.864053 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:25.837392 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 20:05:25.864053 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:25.837438 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-23.ec2.internal\" not found"
Apr 20 20:05:25.936260 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.936208 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:05:25.937006 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.936990 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:05:25.937080 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.937017 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:05:25.937080 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.937027 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:05:25.937080 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.937051 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-23.ec2.internal"
Apr 20 20:05:25.937837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.937806 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 20:05:25.938926 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.938907 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 20:05:25.938926 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.938929 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 20:05:25.939063 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.938944 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 20:05:25.939063 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.938953 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 20:05:25.939063 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:25.938989 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 20:05:25.941563 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.941547 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:05:25.945970 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:25.945955 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-23.ec2.internal"
Apr 20 20:05:25.946065 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:25.945973 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-23.ec2.internal\": node \"ip-10-0-143-23.ec2.internal\" not found"
Apr 20 20:05:25.978981 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:25.978965 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-23.ec2.internal\" not found"
Apr 20 20:05:26.040055 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.040035 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-143-23.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal"]
Apr 20 20:05:26.040145 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.040088 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:05:26.041328 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.041314 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:05:26.041401 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.041337 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:05:26.041401 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.041348 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:05:26.043237 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.043226 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:05:26.043355 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.043342 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-23.ec2.internal"
Apr 20 20:05:26.043391 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.043367 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:05:26.043820 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.043808 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:05:26.043820 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.043812 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:05:26.043930 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.043830 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:05:26.043930 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.043835 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:05:26.043930 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.043847 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:05:26.043930 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.043848 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:05:26.045728 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.045714 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal"
Apr 20 20:05:26.045795 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.045737 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:05:26.046382 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.046369 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:05:26.046457 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.046398 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:05:26.046457 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.046413 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:05:26.068599 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:26.068586 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-23.ec2.internal\" not found" node="ip-10-0-143-23.ec2.internal"
Apr 20 20:05:26.072590 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:26.072577 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-23.ec2.internal\" not found" node="ip-10-0-143-23.ec2.internal"
Apr 20 20:05:26.079784 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:26.079770 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-23.ec2.internal\" not found"
Apr 20 20:05:26.079847 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.079815 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7fe58c6b7d5dfc478334c93846063b98-config\") pod \"kube-apiserver-proxy-ip-10-0-143-23.ec2.internal\" (UID: \"7fe58c6b7d5dfc478334c93846063b98\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-23.ec2.internal"
Apr 20 20:05:26.079847 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.079834 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9f836079fce259ee26f3d68a7bc5ab6d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal\" (UID: \"9f836079fce259ee26f3d68a7bc5ab6d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal"
Apr 20 20:05:26.079931 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.079858 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f836079fce259ee26f3d68a7bc5ab6d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal\" (UID: \"9f836079fce259ee26f3d68a7bc5ab6d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal"
Apr 20 20:05:26.180434 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:26.180414 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-23.ec2.internal\" not found"
Apr 20 20:05:26.180509 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.180457 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7fe58c6b7d5dfc478334c93846063b98-config\") pod \"kube-apiserver-proxy-ip-10-0-143-23.ec2.internal\" (UID: \"7fe58c6b7d5dfc478334c93846063b98\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-23.ec2.internal"
Apr 20 20:05:26.180509 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.180476 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9f836079fce259ee26f3d68a7bc5ab6d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal\" (UID: \"9f836079fce259ee26f3d68a7bc5ab6d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal"
Apr 20 20:05:26.180509 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.180495 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f836079fce259ee26f3d68a7bc5ab6d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal\" (UID: \"9f836079fce259ee26f3d68a7bc5ab6d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal"
Apr 20 20:05:26.180604 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.180549 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/7fe58c6b7d5dfc478334c93846063b98-config\") pod \"kube-apiserver-proxy-ip-10-0-143-23.ec2.internal\" (UID: \"7fe58c6b7d5dfc478334c93846063b98\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-23.ec2.internal"
Apr 20 20:05:26.180604 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.180558 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f836079fce259ee26f3d68a7bc5ab6d-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal\" (UID: \"9f836079fce259ee26f3d68a7bc5ab6d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal"
Apr 20 20:05:26.180604 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.180566 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9f836079fce259ee26f3d68a7bc5ab6d-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal\" (UID: \"9f836079fce259ee26f3d68a7bc5ab6d\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal"
Apr 20 20:05:26.281022 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:26.280981 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-23.ec2.internal\" not found"
Apr 20 20:05:26.371208 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.371186 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-23.ec2.internal"
Apr 20 20:05:26.375603 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.375586 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal"
Apr 20 20:05:26.382087 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:26.382069 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-23.ec2.internal\" not found"
Apr 20 20:05:26.482518 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:26.482486 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-23.ec2.internal\" not found"
Apr 20 20:05:26.582883 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:26.582831 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-23.ec2.internal\" not found"
Apr 20 20:05:26.678124 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.678083 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 20:05:26.678601 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.678234 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 20:05:26.683239 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:26.683218 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-23.ec2.internal\" not found"
Apr 20 20:05:26.739413 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.739392 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:05:26.747911 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.747895 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:05:26.777042 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.777021 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 20:05:26.783636 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:26.783618 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-23.ec2.internal\" not found"
Apr 20 20:05:26.787362 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.787344 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 20:05:26.802475 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.802442 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 20:00:25 +0000 UTC" deadline="2027-09-30 08:30:41.303406461 +0000 UTC"
Apr 20 20:05:26.802550 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.802471 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12660h25m14.500938772s"
Apr 20 20:05:26.806796 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.806779 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jt994"
Apr 20 20:05:26.815060 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:26.815047 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jt994"
Apr 20 20:05:26.884138 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:26.884062 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-23.ec2.internal\" not found"
Apr 20 20:05:26.984715 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:26.984689 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-23.ec2.internal\" not found"
Apr 20 20:05:27.016569 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.016550 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:05:27.027475 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:27.027451 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f836079fce259ee26f3d68a7bc5ab6d.slice/crio-d6775a1da80ca5a1a66b5647cce249386359477fe9622d5890a29712e4af21c5 WatchSource:0}: Error finding container d6775a1da80ca5a1a66b5647cce249386359477fe9622d5890a29712e4af21c5: Status 404 returned error can't find the container with id d6775a1da80ca5a1a66b5647cce249386359477fe9622d5890a29712e4af21c5
Apr 20 20:05:27.085775 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:27.085755 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-23.ec2.internal\" not found"
Apr 20 20:05:27.086124 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.086090 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:05:27.177554 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.177510 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-23.ec2.internal"
Apr 20 20:05:27.187812 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.187796 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 20:05:27.189160 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.189149 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal"
Apr 20 20:05:27.199642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.199624 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 20:05:27.526989 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.526912 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:05:27.754351 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.754322 2576 apiserver.go:52] "Watching apiserver"
Apr 20 20:05:27.765092 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.765069 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 20 20:05:27.765516 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.765490 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-nt6rc","openshift-ovn-kubernetes/ovnkube-node-z55qt","kube-system/konnectivity-agent-fsxdh","kube-system/kube-apiserver-proxy-ip-10-0-143-23.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg","openshift-image-registry/node-ca-56wkj","openshift-multus/multus-44qfk","openshift-multus/multus-additional-cni-plugins-68wc5","openshift-multus/network-metrics-daemon-2n29c","kube-system/global-pull-secret-syncer-7wn8c","openshift-cluster-node-tuning-operator/tuned-lzm5k","openshift-dns/node-resolver-kzk2j","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal","openshift-network-diagnostics/network-check-target-p8x9m"]
Apr 20 20:05:27.770393 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.770368 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.772485 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.772460 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c"
Apr 20 20:05:27.772605 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:27.772552 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3"
Apr 20 20:05:27.773289 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.773263 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 20 20:05:27.773413 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.773383 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-dgxxt\""
Apr 20 20:05:27.773490 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.773451 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 20 20:05:27.773490 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.773481 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 20 20:05:27.773587 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.773556 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 20 20:05:27.776928 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.776907 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg"
Apr 20 20:05:27.779248 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.779228 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 20:05:27.779336 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.779260 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-nr8hs\""
Apr 20 20:05:27.779568 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.779552 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 20:05:27.779705 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.779560 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 20 20:05:27.780567 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.780553 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c"
Apr 20 20:05:27.780672 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:27.780643 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f"
Apr 20 20:05:27.782932 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.782907 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-56wkj"
Apr 20 20:05:27.783030 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.783010 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fsxdh"
Apr 20 20:05:27.785831 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.785636 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-68wc5"
Apr 20 20:05:27.785831 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.785713 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 20:05:27.785998 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.785893 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 20 20:05:27.785998 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.785924 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-n2vgw\""
Apr 20 20:05:27.786252 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.786193 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-jzfff\""
Apr 20 20:05:27.786252 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.786193 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 20:05:27.786363 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.786338 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 20 20:05:27.786636 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.786460 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 20:05:27.787877 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.787852 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt"
Apr 20 20:05:27.788689 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.788668 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 20:05:27.788782 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.788707 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 20:05:27.788782 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.788747 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-prfnm\""
Apr 20 20:05:27.790210 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.790194 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nt6rc"
Apr 20 20:05:27.790648 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.790628 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 20 20:05:27.790906 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.790737 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-os-release\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.790906 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.790770 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-run-netns\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.790906 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.790798 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-var-lib-cni-multus\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.790906 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.790823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d3bb9caf-0865-431c-89be-a2a222a45633-multus-daemon-config\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.790906 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.790865 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nphp\" (UniqueName: \"kubernetes.io/projected/d3bb9caf-0865-431c-89be-a2a222a45633-kube-api-access-4nphp\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.790906 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.790896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2f559047-01c4-4f3b-a5a7-1c183af11e8f-kubelet-config\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c"
Apr 20 20:05:27.791257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.790921 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg"
Apr 20 20:05:27.791257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.790943 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-sys-fs\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg"
Apr 20 20:05:27.791257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.790974 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-cnibin\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.791257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791000 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d3bb9caf-0865-431c-89be-a2a222a45633-cni-binary-copy\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.791257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791026 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-multus-socket-dir-parent\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.791257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791069 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-var-lib-kubelet\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.791257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791097 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-etc-kubernetes\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.791257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs\") pod \"network-metrics-daemon-2n29c\" (UID: \"4c96dea8-a54f-4ca2-a3fb-757208554fe3\") " pod="openshift-multus/network-metrics-daemon-2n29c"
Apr 20 20:05:27.791257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl6jb\" (UniqueName: \"kubernetes.io/projected/4c96dea8-a54f-4ca2-a3fb-757208554fe3-kube-api-access-vl6jb\") pod \"network-metrics-daemon-2n29c\" (UID: \"4c96dea8-a54f-4ca2-a3fb-757208554fe3\") " pod="openshift-multus/network-metrics-daemon-2n29c"
Apr 20 20:05:27.791257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791224 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-device-dir\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg"
Apr 20 20:05:27.791257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791261 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-multus-cni-dir\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.791744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791288 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-hostroot\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.791744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791303 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c5b0ba6-52e0-4a02-bb42-a4b2b377f250-serviceca\") pod \"node-ca-56wkj\" (UID: \"8c5b0ba6-52e0-4a02-bb42-a4b2b377f250\") " pod="openshift-image-registry/node-ca-56wkj"
Apr 20 20:05:27.791744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791316 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c5b0ba6-52e0-4a02-bb42-a4b2b377f250-host\") pod \"node-ca-56wkj\" (UID: \"8c5b0ba6-52e0-4a02-bb42-a4b2b377f250\") " pod="openshift-image-registry/node-ca-56wkj"
Apr 20 20:05:27.791744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791332 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-system-cni-dir\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.791744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791360 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2f559047-01c4-4f3b-a5a7-1c183af11e8f-dbus\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c"
Apr 20 20:05:27.791744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791393 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c"
Apr 20 20:05:27.791744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791421 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-registration-dir\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg"
Apr 20 20:05:27.791744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791445 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hl9mj\" (UniqueName: \"kubernetes.io/projected/8c5b0ba6-52e0-4a02-bb42-a4b2b377f250-kube-api-access-hl9mj\") pod \"node-ca-56wkj\" (UID: \"8c5b0ba6-52e0-4a02-bb42-a4b2b377f250\") " pod="openshift-image-registry/node-ca-56wkj"
Apr 20 20:05:27.791744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791490 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-multus-conf-dir\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.791744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791529 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-socket-dir\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg"
Apr 20 20:05:27.791744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791572 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngjgb\" (UniqueName: \"kubernetes.io/projected/544a3b4c-fde7-4ae8-a14c-e19625554dfe-kube-api-access-ngjgb\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg"
Apr 20 20:05:27.791744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791613 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-run-k8s-cni-cncf-io\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.791744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791643 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-run-multus-certs\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.791744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791680 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-etc-selinux\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg"
Apr 20 20:05:27.791744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.791715 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-var-lib-cni-bin\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk"
Apr 20 20:05:27.792399 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.792047 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 20:05:27.792399 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.792150 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 20 20:05:27.792399 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.792206 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 20 20:05:27.792399 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.792393 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 20 20:05:27.792573 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.792543 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 20 20:05:27.792756 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.792680 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vl2jc\""
Apr 20 20:05:27.793165 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.793147 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 20:05:27.793327 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.793310 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-f8h82\""
Apr 20 20:05:27.793388 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.793368 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 20:05:27.793435 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.793151 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:05:27.793761 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.793745 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lzm5k"
Apr 20 20:05:27.796347 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.796276 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kzk2j"
Apr 20 20:05:27.796442 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.796360 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-52c62\""
Apr 20 20:05:27.802318 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.798302 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:05:27.802318 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.798379 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 20 20:05:27.802318 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.798937 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wrl75\""
Apr 20 20:05:27.802318 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.801698 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 20:05:27.802547 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.802326 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 20:05:27.803929 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.803789 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m"
Apr 20 20:05:27.804022 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:27.803993 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32"
pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:27.817042 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.817017 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:00:26 +0000 UTC" deadline="2027-12-31 06:49:17.320065114 +0000 UTC" Apr 20 20:05:27.817042 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.817040 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14866h43m49.50302732s" Apr 20 20:05:27.879274 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.879256 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 20:05:27.892694 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.892668 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0a9e3157-7a26-4d6c-9f83-58c9eca7c51a-konnectivity-ca\") pod \"konnectivity-agent-fsxdh\" (UID: \"0a9e3157-7a26-4d6c-9f83-58c9eca7c51a\") " pod="kube-system/konnectivity-agent-fsxdh" Apr 20 20:05:27.892798 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.892710 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-system-cni-dir\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.892798 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.892737 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2f559047-01c4-4f3b-a5a7-1c183af11e8f-dbus\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:27.892907 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.892808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-system-cni-dir\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.892907 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.892852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:27.892907 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.892866 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2f559047-01c4-4f3b-a5a7-1c183af11e8f-dbus\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:27.892907 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.892881 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-registration-dir\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" Apr 20 20:05:27.893094 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.892908 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hl9mj\" (UniqueName: \"kubernetes.io/projected/8c5b0ba6-52e0-4a02-bb42-a4b2b377f250-kube-api-access-hl9mj\") pod \"node-ca-56wkj\" (UID: \"8c5b0ba6-52e0-4a02-bb42-a4b2b377f250\") " pod="openshift-image-registry/node-ca-56wkj" Apr 20 20:05:27.893094 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.892924 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-registration-dir\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" Apr 20 20:05:27.893094 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.892938 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnzv7\" (UniqueName: \"kubernetes.io/projected/45f05212-3e62-445f-af62-b586721d3417-kube-api-access-fnzv7\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.893094 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.892968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-run-openvswitch\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.893094 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.892991 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-sysctl-d\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.893094 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:27.892997 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:27.893094 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-multus-conf-dir\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.893094 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-socket-dir\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" Apr 20 20:05:27.893094 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:27.893056 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret podName:2f559047-01c4-4f3b-a5a7-1c183af11e8f nodeName:}" failed. 
No retries permitted until 2026-04-20 20:05:28.393037443 +0000 UTC m=+3.105719426 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret") pod "global-pull-secret-syncer-7wn8c" (UID: "2f559047-01c4-4f3b-a5a7-1c183af11e8f") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:27.893094 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893091 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84l8q\" (UniqueName: \"kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q\") pod \"network-check-target-p8x9m\" (UID: \"26d6564f-741b-481d-a1c3-a42559981c32\") " pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45668015-eebd-4a0f-adbc-9ebb7f100cbd-host-slash\") pod \"iptables-alerter-nt6rc\" (UID: \"45668015-eebd-4a0f-adbc-9ebb7f100cbd\") " pod="openshift-network-operator/iptables-alerter-nt6rc" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893157 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-sysctl-conf\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-socket-dir\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893184 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-sys\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893216 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9cdd6b10-c252-4482-9ba2-40fdae9ef435-hosts-file\") pod \"node-resolver-kzk2j\" (UID: \"9cdd6b10-c252-4482-9ba2-40fdae9ef435\") " pod="openshift-dns/node-resolver-kzk2j" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-multus-conf-dir\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-run-k8s-cni-cncf-io\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-run-multus-certs\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/45f05212-3e62-445f-af62-b586721d3417-tuning-conf-dir\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893346 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-run-k8s-cni-cncf-io\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893361 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-run-multus-certs\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893352 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-log-socket\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893414 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-ovnkube-script-lib\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893440 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfbmb\" (UniqueName: \"kubernetes.io/projected/7f59f5a3-2821-4412-9688-73d69c9bbb4c-kube-api-access-hfbmb\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893470 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-run-ovn-kubernetes\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 
20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893492 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-ovnkube-config\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.893576 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893516 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwck5\" (UniqueName: \"kubernetes.io/projected/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-kube-api-access-pwck5\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893537 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-run\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-os-release\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-run-netns\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-var-lib-cni-multus\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893684 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-os-release\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893696 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-run-netns\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893769 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/45f05212-3e62-445f-af62-b586721d3417-system-cni-dir\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893770 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-var-lib-cni-multus\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893785 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-systemd-units\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893802 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-var-lib-openvswitch\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-cni-netd\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893851 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-cnibin\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-kubelet-dir\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893874 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-etc-kubernetes\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vl6jb\" (UniqueName: 
\"kubernetes.io/projected/4c96dea8-a54f-4ca2-a3fb-757208554fe3-kube-api-access-vl6jb\") pod \"network-metrics-daemon-2n29c\" (UID: \"4c96dea8-a54f-4ca2-a3fb-757208554fe3\") " pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:27.894326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893929 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-cni-bin\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893962 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ph8q\" (UniqueName: \"kubernetes.io/projected/45668015-eebd-4a0f-adbc-9ebb7f100cbd-kube-api-access-6ph8q\") pod \"iptables-alerter-nt6rc\" (UID: \"45668015-eebd-4a0f-adbc-9ebb7f100cbd\") " pod="openshift-network-operator/iptables-alerter-nt6rc" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-cnibin\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.893983 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-multus-cni-dir\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894019 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-hostroot\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-etc-openvswitch\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-etc-kubernetes\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894129 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9cdd6b10-c252-4482-9ba2-40fdae9ef435-tmp-dir\") pod \"node-resolver-kzk2j\" (UID: \"9cdd6b10-c252-4482-9ba2-40fdae9ef435\") " pod="openshift-dns/node-resolver-kzk2j" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-multus-cni-dir\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894148 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-hostroot\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894156 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c5b0ba6-52e0-4a02-bb42-a4b2b377f250-host\") pod \"node-ca-56wkj\" (UID: \"8c5b0ba6-52e0-4a02-bb42-a4b2b377f250\") " pod="openshift-image-registry/node-ca-56wkj" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-kubelet\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c5b0ba6-52e0-4a02-bb42-a4b2b377f250-host\") pod \"node-ca-56wkj\" (UID: \"8c5b0ba6-52e0-4a02-bb42-a4b2b377f250\") " pod="openshift-image-registry/node-ca-56wkj" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894218 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894245 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfx99\" (UniqueName: \"kubernetes.io/projected/9cdd6b10-c252-4482-9ba2-40fdae9ef435-kube-api-access-mfx99\") pod \"node-resolver-kzk2j\" (UID: \"9cdd6b10-c252-4482-9ba2-40fdae9ef435\") " pod="openshift-dns/node-resolver-kzk2j" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/45f05212-3e62-445f-af62-b586721d3417-cnibin\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894294 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/45f05212-3e62-445f-af62-b586721d3417-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 
20:05:27.895072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894315 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-run-ovn\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-node-log\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894373 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-modprobe-d\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngjgb\" (UniqueName: \"kubernetes.io/projected/544a3b4c-fde7-4ae8-a14c-e19625554dfe-kube-api-access-ngjgb\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894438 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/45f05212-3e62-445f-af62-b586721d3417-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894462 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-systemd\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894497 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-etc-selinux\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894520 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0a9e3157-7a26-4d6c-9f83-58c9eca7c51a-agent-certs\") pod \"konnectivity-agent-fsxdh\" (UID: \"0a9e3157-7a26-4d6c-9f83-58c9eca7c51a\") " pod="kube-system/konnectivity-agent-fsxdh" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894550 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-kubernetes\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894577 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-var-lib-cni-bin\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894601 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/45f05212-3e62-445f-af62-b586721d3417-os-release\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894656 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/45f05212-3e62-445f-af62-b586721d3417-cni-binary-copy\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894673 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-etc-selinux\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-slash\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894693 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-var-lib-cni-bin\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-run-systemd\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d3bb9caf-0865-431c-89be-a2a222a45633-multus-daemon-config\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" 
Apr 20 20:05:27.895866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894791 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nphp\" (UniqueName: \"kubernetes.io/projected/d3bb9caf-0865-431c-89be-a2a222a45633-kube-api-access-4nphp\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894815 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2f559047-01c4-4f3b-a5a7-1c183af11e8f-kubelet-config\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-sys-fs\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894890 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/45668015-eebd-4a0f-adbc-9ebb7f100cbd-iptables-alerter-script\") pod \"iptables-alerter-nt6rc\" (UID: \"45668015-eebd-4a0f-adbc-9ebb7f100cbd\") " pod="openshift-network-operator/iptables-alerter-nt6rc" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894961 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-sysconfig\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.894996 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-lib-modules\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895018 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-host\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895043 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d3bb9caf-0865-431c-89be-a2a222a45633-cni-binary-copy\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-multus-socket-dir-parent\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895151 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-var-lib-kubelet\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs\") pod \"network-metrics-daemon-2n29c\" (UID: \"4c96dea8-a54f-4ca2-a3fb-757208554fe3\") " pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895240 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-host-var-lib-kubelet\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-device-dir\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895329 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c5b0ba6-52e0-4a02-bb42-a4b2b377f250-serviceca\") pod \"node-ca-56wkj\" (UID: \"8c5b0ba6-52e0-4a02-bb42-a4b2b377f250\") " pod="openshift-image-registry/node-ca-56wkj" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895360 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-env-overrides\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-ovn-node-metrics-cert\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895440 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2f559047-01c4-4f3b-a5a7-1c183af11e8f-kubelet-config\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:27.896613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895477 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-run-netns\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.897377 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895515 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-var-lib-kubelet\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.897377 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895549 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-tuned\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.897377 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895617 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-sys-fs\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" Apr 20 20:05:27.897377 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895675 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f59f5a3-2821-4412-9688-73d69c9bbb4c-tmp\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.897377 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d3bb9caf-0865-431c-89be-a2a222a45633-multus-socket-dir-parent\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.897377 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895818 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/544a3b4c-fde7-4ae8-a14c-e19625554dfe-device-dir\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" Apr 20 20:05:27.897377 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.895981 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d3bb9caf-0865-431c-89be-a2a222a45633-multus-daemon-config\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.897377 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.896100 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c5b0ba6-52e0-4a02-bb42-a4b2b377f250-serviceca\") pod \"node-ca-56wkj\" (UID: \"8c5b0ba6-52e0-4a02-bb42-a4b2b377f250\") " pod="openshift-image-registry/node-ca-56wkj" Apr 20 
20:05:27.897377 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.896182 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d3bb9caf-0865-431c-89be-a2a222a45633-cni-binary-copy\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.897377 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:27.896313 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:27.897377 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:27.896431 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs podName:4c96dea8-a54f-4ca2-a3fb-757208554fe3 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:28.396414493 +0000 UTC m=+3.109096492 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs") pod "network-metrics-daemon-2n29c" (UID: "4c96dea8-a54f-4ca2-a3fb-757208554fe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:27.901479 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.901455 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 20:05:27.905486 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.905457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nphp\" (UniqueName: \"kubernetes.io/projected/d3bb9caf-0865-431c-89be-a2a222a45633-kube-api-access-4nphp\") pod \"multus-44qfk\" (UID: \"d3bb9caf-0865-431c-89be-a2a222a45633\") " pod="openshift-multus/multus-44qfk" Apr 20 20:05:27.905590 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.905463 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hl9mj\" (UniqueName: \"kubernetes.io/projected/8c5b0ba6-52e0-4a02-bb42-a4b2b377f250-kube-api-access-hl9mj\") pod \"node-ca-56wkj\" (UID: \"8c5b0ba6-52e0-4a02-bb42-a4b2b377f250\") " pod="openshift-image-registry/node-ca-56wkj" Apr 20 20:05:27.905764 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.905742 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngjgb\" (UniqueName: \"kubernetes.io/projected/544a3b4c-fde7-4ae8-a14c-e19625554dfe-kube-api-access-ngjgb\") pod \"aws-ebs-csi-driver-node-k58dg\" (UID: \"544a3b4c-fde7-4ae8-a14c-e19625554dfe\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" Apr 20 20:05:27.906560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.906539 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl6jb\" (UniqueName: \"kubernetes.io/projected/4c96dea8-a54f-4ca2-a3fb-757208554fe3-kube-api-access-vl6jb\") pod \"network-metrics-daemon-2n29c\" (UID: \"4c96dea8-a54f-4ca2-a3fb-757208554fe3\") " pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:27.947200 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.947152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-23.ec2.internal" 
event={"ID":"7fe58c6b7d5dfc478334c93846063b98","Type":"ContainerStarted","Data":"8ece4fb182e4da627371008fb07444bac10d76d65416ca0245962645d67cb724"} Apr 20 20:05:27.948285 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.948253 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal" event={"ID":"9f836079fce259ee26f3d68a7bc5ab6d","Type":"ContainerStarted","Data":"d6775a1da80ca5a1a66b5647cce249386359477fe9622d5890a29712e4af21c5"} Apr 20 20:05:27.996249 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/45668015-eebd-4a0f-adbc-9ebb7f100cbd-iptables-alerter-script\") pod \"iptables-alerter-nt6rc\" (UID: \"45668015-eebd-4a0f-adbc-9ebb7f100cbd\") " pod="openshift-network-operator/iptables-alerter-nt6rc" Apr 20 20:05:27.996348 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-sysconfig\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.996348 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-lib-modules\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.996348 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996304 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-host\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.996348 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996346 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-env-overrides\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.996497 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996370 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-ovn-node-metrics-cert\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.996497 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996377 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-sysconfig\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.996497 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996393 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-run-netns\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.996497 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-var-lib-kubelet\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.996497 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996438 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-run-netns\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.996497 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996460 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-tuned\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.996497 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f59f5a3-2821-4412-9688-73d69c9bbb4c-tmp\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.996741 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0a9e3157-7a26-4d6c-9f83-58c9eca7c51a-konnectivity-ca\") pod \"konnectivity-agent-fsxdh\" (UID: \"0a9e3157-7a26-4d6c-9f83-58c9eca7c51a\") " pod="kube-system/konnectivity-agent-fsxdh" Apr 20 20:05:27.996741 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996533 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-lib-modules\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.996741 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fnzv7\" (UniqueName: \"kubernetes.io/projected/45f05212-3e62-445f-af62-b586721d3417-kube-api-access-fnzv7\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.996741 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-host\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.996741 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996624 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-var-lib-kubelet\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.996952 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-run-openvswitch\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.996952 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-sysctl-d\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.996952 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996838 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l8q\" (UniqueName: \"kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q\") pod \"network-check-target-p8x9m\" (UID: \"26d6564f-741b-481d-a1c3-a42559981c32\") " pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:27.996952 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996865 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45668015-eebd-4a0f-adbc-9ebb7f100cbd-host-slash\") pod \"iptables-alerter-nt6rc\" (UID: \"45668015-eebd-4a0f-adbc-9ebb7f100cbd\") " pod="openshift-network-operator/iptables-alerter-nt6rc" Apr 20 20:05:27.996952 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-sysctl-conf\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.996952 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-sys\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.996952 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996925 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-run-openvswitch\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.996952 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996938 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9cdd6b10-c252-4482-9ba2-40fdae9ef435-hosts-file\") pod \"node-resolver-kzk2j\" (UID: \"9cdd6b10-c252-4482-9ba2-40fdae9ef435\") " pod="openshift-dns/node-resolver-kzk2j" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996982 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/45f05212-3e62-445f-af62-b586721d3417-tuning-conf-dir\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.996987 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9cdd6b10-c252-4482-9ba2-40fdae9ef435-hosts-file\") pod \"node-resolver-kzk2j\" (UID: \"9cdd6b10-c252-4482-9ba2-40fdae9ef435\") " pod="openshift-dns/node-resolver-kzk2j" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997010 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-log-socket\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-ovnkube-script-lib\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997061 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-env-overrides\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfbmb\" (UniqueName: \"kubernetes.io/projected/7f59f5a3-2821-4412-9688-73d69c9bbb4c-kube-api-access-hfbmb\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997128 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0a9e3157-7a26-4d6c-9f83-58c9eca7c51a-konnectivity-ca\") pod \"konnectivity-agent-fsxdh\" (UID: \"0a9e3157-7a26-4d6c-9f83-58c9eca7c51a\") " pod="kube-system/konnectivity-agent-fsxdh" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-run-ovn-kubernetes\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-run-ovn-kubernetes\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-ovnkube-config\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwck5\" (UniqueName: \"kubernetes.io/projected/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-kube-api-access-pwck5\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997215 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/45668015-eebd-4a0f-adbc-9ebb7f100cbd-host-slash\") pod \"iptables-alerter-nt6rc\" (UID: \"45668015-eebd-4a0f-adbc-9ebb7f100cbd\") " pod="openshift-network-operator/iptables-alerter-nt6rc" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997236 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-run\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997270 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/45f05212-3e62-445f-af62-b586721d3417-system-cni-dir\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997293 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-systemd-units\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997316 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-var-lib-openvswitch\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.997333 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997324 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-sysctl-d\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997339 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-cni-netd\") pod 
\"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-cni-bin\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6ph8q\" (UniqueName: \"kubernetes.io/projected/45668015-eebd-4a0f-adbc-9ebb7f100cbd-kube-api-access-6ph8q\") pod \"iptables-alerter-nt6rc\" (UID: \"45668015-eebd-4a0f-adbc-9ebb7f100cbd\") " pod="openshift-network-operator/iptables-alerter-nt6rc" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-etc-openvswitch\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9cdd6b10-c252-4482-9ba2-40fdae9ef435-tmp-dir\") pod \"node-resolver-kzk2j\" (UID: \"9cdd6b10-c252-4482-9ba2-40fdae9ef435\") " pod="openshift-dns/node-resolver-kzk2j" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997486 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-kubelet\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997539 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfx99\" (UniqueName: \"kubernetes.io/projected/9cdd6b10-c252-4482-9ba2-40fdae9ef435-kube-api-access-mfx99\") pod \"node-resolver-kzk2j\" (UID: \"9cdd6b10-c252-4482-9ba2-40fdae9ef435\") " pod="openshift-dns/node-resolver-kzk2j" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/45f05212-3e62-445f-af62-b586721d3417-cnibin\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997621 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/45f05212-3e62-445f-af62-b586721d3417-cnibin\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-cni-netd\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997339 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-log-socket\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/45f05212-3e62-445f-af62-b586721d3417-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-run-ovn\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-cni-bin\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997704 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-node-log\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-run-ovn\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-modprobe-d\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997757 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/45f05212-3e62-445f-af62-b586721d3417-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997784 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-systemd\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997810 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0a9e3157-7a26-4d6c-9f83-58c9eca7c51a-agent-certs\") pod \"konnectivity-agent-fsxdh\" (UID: \"0a9e3157-7a26-4d6c-9f83-58c9eca7c51a\") " pod="kube-system/konnectivity-agent-fsxdh" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997814 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-ovnkube-script-lib\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-kubernetes\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/45f05212-3e62-445f-af62-b586721d3417-os-release\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997887 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/45f05212-3e62-445f-af62-b586721d3417-cni-binary-copy\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-slash\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.997957 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-run-systemd\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998032 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-run-systemd\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998069 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-node-log\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998173 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-modprobe-d\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998253 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-etc-openvswitch\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998403 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/45f05212-3e62-445f-af62-b586721d3417-system-cni-dir\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998447 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-systemd-units\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.998837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9cdd6b10-c252-4482-9ba2-40fdae9ef435-tmp-dir\") pod \"node-resolver-kzk2j\" (UID: \"9cdd6b10-c252-4482-9ba2-40fdae9ef435\") " pod="openshift-dns/node-resolver-kzk2j" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998541 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/45f05212-3e62-445f-af62-b586721d3417-tuning-conf-dir\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998570 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-var-lib-openvswitch\") pod \"ovnkube-node-z55qt\" (UID: 
\"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998609 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-kubelet\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998652 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/45f05212-3e62-445f-af62-b586721d3417-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998654 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998745 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-systemd\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998771 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-sysctl-conf\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/45f05212-3e62-445f-af62-b586721d3417-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998177 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-ovnkube-config\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998937 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-host-slash\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.998972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-kubernetes\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.999011 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-run\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.999045 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7f59f5a3-2821-4412-9688-73d69c9bbb4c-sys\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.999123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/45f05212-3e62-445f-af62-b586721d3417-os-release\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.999243 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/45f05212-3e62-445f-af62-b586721d3417-cni-binary-copy\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:27.999642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:27.999244 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/45668015-eebd-4a0f-adbc-9ebb7f100cbd-iptables-alerter-script\") pod \"iptables-alerter-nt6rc\" (UID: \"45668015-eebd-4a0f-adbc-9ebb7f100cbd\") " pod="openshift-network-operator/iptables-alerter-nt6rc" Apr 20 20:05:28.000992 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.000413 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7f59f5a3-2821-4412-9688-73d69c9bbb4c-etc-tuned\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:28.001288 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.001264 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7f59f5a3-2821-4412-9688-73d69c9bbb4c-tmp\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:28.001691 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.001669 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-ovn-node-metrics-cert\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:28.001777 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.001714 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" 
(UniqueName: \"kubernetes.io/secret/0a9e3157-7a26-4d6c-9f83-58c9eca7c51a-agent-certs\") pod \"konnectivity-agent-fsxdh\" (UID: \"0a9e3157-7a26-4d6c-9f83-58c9eca7c51a\") " pod="kube-system/konnectivity-agent-fsxdh" Apr 20 20:05:28.003426 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:28.003176 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:28.003426 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:28.003196 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:28.003426 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:28.003209 2576 projected.go:194] Error preparing data for projected volume kube-api-access-84l8q for pod openshift-network-diagnostics/network-check-target-p8x9m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:28.003426 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:28.003266 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q podName:26d6564f-741b-481d-a1c3-a42559981c32 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:28.503249317 +0000 UTC m=+3.215931295 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-84l8q" (UniqueName: "kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q") pod "network-check-target-p8x9m" (UID: "26d6564f-741b-481d-a1c3-a42559981c32") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:28.006004 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.005964 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfx99\" (UniqueName: \"kubernetes.io/projected/9cdd6b10-c252-4482-9ba2-40fdae9ef435-kube-api-access-mfx99\") pod \"node-resolver-kzk2j\" (UID: \"9cdd6b10-c252-4482-9ba2-40fdae9ef435\") " pod="openshift-dns/node-resolver-kzk2j" Apr 20 20:05:28.006419 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.006400 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfbmb\" (UniqueName: \"kubernetes.io/projected/7f59f5a3-2821-4412-9688-73d69c9bbb4c-kube-api-access-hfbmb\") pod \"tuned-lzm5k\" (UID: \"7f59f5a3-2821-4412-9688-73d69c9bbb4c\") " pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:28.007125 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.007086 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnzv7\" (UniqueName: \"kubernetes.io/projected/45f05212-3e62-445f-af62-b586721d3417-kube-api-access-fnzv7\") pod \"multus-additional-cni-plugins-68wc5\" (UID: \"45f05212-3e62-445f-af62-b586721d3417\") " pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:28.007491 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.007470 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwck5\" (UniqueName: \"kubernetes.io/projected/f78ac3d9-bcf1-43dd-aac7-1678831ee3ba-kube-api-access-pwck5\") pod \"ovnkube-node-z55qt\" (UID: \"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:28.007644 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.007621 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ph8q\" (UniqueName: \"kubernetes.io/projected/45668015-eebd-4a0f-adbc-9ebb7f100cbd-kube-api-access-6ph8q\") pod \"iptables-alerter-nt6rc\" (UID: \"45668015-eebd-4a0f-adbc-9ebb7f100cbd\") " pod="openshift-network-operator/iptables-alerter-nt6rc" Apr 20 20:05:28.083311 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.083243 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-44qfk" Apr 20 20:05:28.090025 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.090001 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" Apr 20 20:05:28.100733 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.100711 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-56wkj" Apr 20 20:05:28.110750 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.110493 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-fsxdh" Apr 20 20:05:28.117223 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.117205 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-68wc5" Apr 20 20:05:28.123773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.123758 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:28.132237 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.132220 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nt6rc" Apr 20 20:05:28.139725 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.139709 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" Apr 20 20:05:28.145226 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.145207 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-kzk2j" Apr 20 20:05:28.401805 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.401737 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs\") pod \"network-metrics-daemon-2n29c\" (UID: \"4c96dea8-a54f-4ca2-a3fb-757208554fe3\") " pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:28.401805 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.401780 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:28.402001 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:28.401919 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:28.402001 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:28.401953 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:28.402081 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:28.402008 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs podName:4c96dea8-a54f-4ca2-a3fb-757208554fe3 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:29.401986541 +0000 UTC m=+4.114668523 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs") pod "network-metrics-daemon-2n29c" (UID: "4c96dea8-a54f-4ca2-a3fb-757208554fe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:28.402081 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:28.402028 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret podName:2f559047-01c4-4f3b-a5a7-1c183af11e8f nodeName:}" failed. No retries permitted until 2026-04-20 20:05:29.402018016 +0000 UTC m=+4.114699996 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret") pod "global-pull-secret-syncer-7wn8c" (UID: "2f559047-01c4-4f3b-a5a7-1c183af11e8f") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:28.604263 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.604231 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l8q\" (UniqueName: \"kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q\") pod \"network-check-target-p8x9m\" (UID: \"26d6564f-741b-481d-a1c3-a42559981c32\") " pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:28.604407 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:28.604384 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:28.604407 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:28.604405 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:28.604482 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:28.604414 2576 projected.go:194] Error preparing data for projected volume kube-api-access-84l8q for pod openshift-network-diagnostics/network-check-target-p8x9m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:28.604482 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:28.604465 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q podName:26d6564f-741b-481d-a1c3-a42559981c32 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:29.604451829 +0000 UTC m=+4.317133803 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-84l8q" (UniqueName: "kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q") pod "network-check-target-p8x9m" (UID: "26d6564f-741b-481d-a1c3-a42559981c32") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:28.665607 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:28.665581 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f59f5a3_2821_4412_9688_73d69c9bbb4c.slice/crio-313164c5ee628ddb90d6a42c2f4fd2b201534150eccfce41ea1656f712ddbf46 WatchSource:0}: Error finding container 313164c5ee628ddb90d6a42c2f4fd2b201534150eccfce41ea1656f712ddbf46: Status 404 returned error can't find the container with id 313164c5ee628ddb90d6a42c2f4fd2b201534150eccfce41ea1656f712ddbf46 Apr 20 20:05:28.666796 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:28.666766 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf78ac3d9_bcf1_43dd_aac7_1678831ee3ba.slice/crio-d2bd8d68c378574c701c7a74f0fddb4107f37af8b20a6bca808a51df74b035e9 WatchSource:0}: Error finding container d2bd8d68c378574c701c7a74f0fddb4107f37af8b20a6bca808a51df74b035e9: Status 404 returned error can't find the container with id d2bd8d68c378574c701c7a74f0fddb4107f37af8b20a6bca808a51df74b035e9 Apr 20 20:05:28.668553 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:28.668532 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a9e3157_7a26_4d6c_9f83_58c9eca7c51a.slice/crio-0141c27b2cca7e7862206ec6676860127e858c6a590bd881ff84a6190c8145a0 WatchSource:0}: Error finding container 0141c27b2cca7e7862206ec6676860127e858c6a590bd881ff84a6190c8145a0: Status 404 returned error can't find the container with id 0141c27b2cca7e7862206ec6676860127e858c6a590bd881ff84a6190c8145a0 Apr 20 20:05:28.671595 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:28.671476 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45f05212_3e62_445f_af62_b586721d3417.slice/crio-dee832cdfa6039c42819ffa113feb08fadbe220f143e2ab198e7f20c63812885 WatchSource:0}: Error finding container dee832cdfa6039c42819ffa113feb08fadbe220f143e2ab198e7f20c63812885: Status 404 returned error can't find the container with id dee832cdfa6039c42819ffa113feb08fadbe220f143e2ab198e7f20c63812885 Apr 20 20:05:28.672932 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:28.672912 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45668015_eebd_4a0f_adbc_9ebb7f100cbd.slice/crio-60d914ad23c9f9f46fe704cbf8d9ebce05655a28e53beb8b37eba44044358f0d WatchSource:0}: Error finding container 60d914ad23c9f9f46fe704cbf8d9ebce05655a28e53beb8b37eba44044358f0d: Status 404 returned error can't find the container with id 60d914ad23c9f9f46fe704cbf8d9ebce05655a28e53beb8b37eba44044358f0d Apr 20 20:05:28.673319 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:28.673216 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c5b0ba6_52e0_4a02_bb42_a4b2b377f250.slice/crio-473bf7511f7ead3daa65c8b08422b9323b5c0a77b34f0373c2b2523e0b92e9b4 WatchSource:0}: Error finding 
container 473bf7511f7ead3daa65c8b08422b9323b5c0a77b34f0373c2b2523e0b92e9b4: Status 404 returned error can't find the container with id 473bf7511f7ead3daa65c8b08422b9323b5c0a77b34f0373c2b2523e0b92e9b4 Apr 20 20:05:28.674166 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:28.674145 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cdd6b10_c252_4482_9ba2_40fdae9ef435.slice/crio-2e58d590ffccdffe729f0e1e671ae5973ef9eca4ba1d1000328ce35dfcfcfc78 WatchSource:0}: Error finding container 2e58d590ffccdffe729f0e1e671ae5973ef9eca4ba1d1000328ce35dfcfcfc78: Status 404 returned error can't find the container with id 2e58d590ffccdffe729f0e1e671ae5973ef9eca4ba1d1000328ce35dfcfcfc78 Apr 20 20:05:28.674836 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:28.674811 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod544a3b4c_fde7_4ae8_a14c_e19625554dfe.slice/crio-f271dcfa4d79584a664a55f708864f6f7202637fd7cdf7c9068177860997d677 WatchSource:0}: Error finding container f271dcfa4d79584a664a55f708864f6f7202637fd7cdf7c9068177860997d677: Status 404 returned error can't find the container with id f271dcfa4d79584a664a55f708864f6f7202637fd7cdf7c9068177860997d677 Apr 20 20:05:28.675807 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:05:28.675713 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3bb9caf_0865_431c_89be_a2a222a45633.slice/crio-3ecb9874541598616e0c2076c0860d6461ef463927abab47c3e748aa20b49759 WatchSource:0}: Error finding container 3ecb9874541598616e0c2076c0860d6461ef463927abab47c3e748aa20b49759: Status 404 returned error can't find the container with id 3ecb9874541598616e0c2076c0860d6461ef463927abab47c3e748aa20b49759 Apr 20 20:05:28.817696 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.817553 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:00:26 +0000 UTC" deadline="2028-01-08 03:38:19.1536663 +0000 UTC" Apr 20 20:05:28.818028 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.817697 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15055h32m50.335973816s" Apr 20 20:05:28.940150 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.940042 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:28.940150 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.940049 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:28.940511 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:28.940178 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:28.940511 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:28.940231 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:28.952244 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.952217 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" event={"ID":"544a3b4c-fde7-4ae8-a14c-e19625554dfe","Type":"ContainerStarted","Data":"f271dcfa4d79584a664a55f708864f6f7202637fd7cdf7c9068177860997d677"} Apr 20 20:05:28.953658 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.953625 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kzk2j" event={"ID":"9cdd6b10-c252-4482-9ba2-40fdae9ef435","Type":"ContainerStarted","Data":"2e58d590ffccdffe729f0e1e671ae5973ef9eca4ba1d1000328ce35dfcfcfc78"} Apr 20 20:05:28.954629 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.954594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" event={"ID":"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba","Type":"ContainerStarted","Data":"d2bd8d68c378574c701c7a74f0fddb4107f37af8b20a6bca808a51df74b035e9"} Apr 20 20:05:28.955546 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.955530 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" event={"ID":"7f59f5a3-2821-4412-9688-73d69c9bbb4c","Type":"ContainerStarted","Data":"313164c5ee628ddb90d6a42c2f4fd2b201534150eccfce41ea1656f712ddbf46"} Apr 20 20:05:28.957040 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.957015 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-23.ec2.internal" event={"ID":"7fe58c6b7d5dfc478334c93846063b98","Type":"ContainerStarted","Data":"5fb0bef680efc7649a4b813cafe923c904de067b3fc069e1b2451eb47247f4a6"} Apr 20 20:05:28.958023 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.958005 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nt6rc" event={"ID":"45668015-eebd-4a0f-adbc-9ebb7f100cbd","Type":"ContainerStarted","Data":"60d914ad23c9f9f46fe704cbf8d9ebce05655a28e53beb8b37eba44044358f0d"} Apr 20 20:05:28.958849 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.958815 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-56wkj" event={"ID":"8c5b0ba6-52e0-4a02-bb42-a4b2b377f250","Type":"ContainerStarted","Data":"473bf7511f7ead3daa65c8b08422b9323b5c0a77b34f0373c2b2523e0b92e9b4"} Apr 20 20:05:28.959652 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.959635 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-44qfk" event={"ID":"d3bb9caf-0865-431c-89be-a2a222a45633","Type":"ContainerStarted","Data":"3ecb9874541598616e0c2076c0860d6461ef463927abab47c3e748aa20b49759"} Apr 20 20:05:28.960500 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.960476 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-68wc5" 
event={"ID":"45f05212-3e62-445f-af62-b586721d3417","Type":"ContainerStarted","Data":"dee832cdfa6039c42819ffa113feb08fadbe220f143e2ab198e7f20c63812885"} Apr 20 20:05:28.961268 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.961251 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fsxdh" event={"ID":"0a9e3157-7a26-4d6c-9f83-58c9eca7c51a","Type":"ContainerStarted","Data":"0141c27b2cca7e7862206ec6676860127e858c6a590bd881ff84a6190c8145a0"} Apr 20 20:05:28.974103 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:28.974060 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-23.ec2.internal" podStartSLOduration=1.97404727 podStartE2EDuration="1.97404727s" podCreationTimestamp="2026-04-20 20:05:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:05:28.974015979 +0000 UTC m=+3.686697975" watchObservedRunningTime="2026-04-20 20:05:28.97404727 +0000 UTC m=+3.686729266" Apr 20 20:05:29.413536 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:29.413507 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs\") pod \"network-metrics-daemon-2n29c\" (UID: \"4c96dea8-a54f-4ca2-a3fb-757208554fe3\") " pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:29.413665 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:29.413561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:29.413729 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:29.413694 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:29.413777 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:29.413752 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret podName:2f559047-01c4-4f3b-a5a7-1c183af11e8f nodeName:}" failed. No retries permitted until 2026-04-20 20:05:31.413735726 +0000 UTC m=+6.126417703 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret") pod "global-pull-secret-syncer-7wn8c" (UID: "2f559047-01c4-4f3b-a5a7-1c183af11e8f") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:29.414146 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:29.414129 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:29.414247 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:29.414180 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs podName:4c96dea8-a54f-4ca2-a3fb-757208554fe3 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:31.414165564 +0000 UTC m=+6.126847541 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs") pod "network-metrics-daemon-2n29c" (UID: "4c96dea8-a54f-4ca2-a3fb-757208554fe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:29.616434 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:29.615812 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l8q\" (UniqueName: \"kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q\") pod \"network-check-target-p8x9m\" (UID: \"26d6564f-741b-481d-a1c3-a42559981c32\") " pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:29.616434 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:29.616004 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:29.616434 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:29.616021 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:29.616434 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:29.616034 2576 projected.go:194] Error preparing data for projected volume kube-api-access-84l8q for pod openshift-network-diagnostics/network-check-target-p8x9m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:29.616434 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:29.616084 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q podName:26d6564f-741b-481d-a1c3-a42559981c32 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:31.616068372 +0000 UTC m=+6.328750362 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-84l8q" (UniqueName: "kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q") pod "network-check-target-p8x9m" (UID: "26d6564f-741b-481d-a1c3-a42559981c32") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:29.942130 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:29.940269 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:29.942130 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:29.940397 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:29.987935 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:29.987900 2576 generic.go:358] "Generic (PLEG): container finished" podID="9f836079fce259ee26f3d68a7bc5ab6d" containerID="abfb8dfe560c0d33a3c209c58f6bb89fbeb613ddc94a1ad3cb20e26bec3042cd" exitCode=0 Apr 20 20:05:29.988834 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:29.988802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal" event={"ID":"9f836079fce259ee26f3d68a7bc5ab6d","Type":"ContainerDied","Data":"abfb8dfe560c0d33a3c209c58f6bb89fbeb613ddc94a1ad3cb20e26bec3042cd"} Apr 20 20:05:30.940847 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:30.940304 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:30.940847 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:30.940339 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:30.940847 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:30.940428 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:30.940847 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:30.940520 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:31.009104 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:31.008492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal" event={"ID":"9f836079fce259ee26f3d68a7bc5ab6d","Type":"ContainerStarted","Data":"bf3dd7fb6f1503db449d4fe2671e0199cc8925d693bbdede0252c1166265ce36"} Apr 20 20:05:31.431439 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:31.431358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs\") pod \"network-metrics-daemon-2n29c\" (UID: \"4c96dea8-a54f-4ca2-a3fb-757208554fe3\") " pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:31.431439 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:31.431424 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:31.431667 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:31.431604 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:31.431667 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:31.431665 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret podName:2f559047-01c4-4f3b-a5a7-1c183af11e8f nodeName:}" failed. No retries permitted until 2026-04-20 20:05:35.431647322 +0000 UTC m=+10.144329298 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret") pod "global-pull-secret-syncer-7wn8c" (UID: "2f559047-01c4-4f3b-a5a7-1c183af11e8f") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:31.432129 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:31.432087 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:31.432223 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:31.432159 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs podName:4c96dea8-a54f-4ca2-a3fb-757208554fe3 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:35.432142893 +0000 UTC m=+10.144824870 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs") pod "network-metrics-daemon-2n29c" (UID: "4c96dea8-a54f-4ca2-a3fb-757208554fe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:31.633256 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:31.633085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l8q\" (UniqueName: \"kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q\") pod \"network-check-target-p8x9m\" (UID: \"26d6564f-741b-481d-a1c3-a42559981c32\") " pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:31.633256 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:31.633251 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:31.633513 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:31.633274 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:31.633513 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:31.633288 2576 projected.go:194] Error preparing data for projected volume kube-api-access-84l8q for pod openshift-network-diagnostics/network-check-target-p8x9m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:31.633513 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:31.633346 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q podName:26d6564f-741b-481d-a1c3-a42559981c32 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:35.633326414 +0000 UTC m=+10.346008390 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-84l8q" (UniqueName: "kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q") pod "network-check-target-p8x9m" (UID: "26d6564f-741b-481d-a1c3-a42559981c32") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:31.941048 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:31.941016 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:31.941237 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:31.941172 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:32.939359 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:32.939326 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:32.939807 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:32.939464 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:32.939807 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:32.939524 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:32.939807 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:32.939656 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:33.940303 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:33.940205 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:33.940750 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:33.940344 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:34.940418 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:34.939721 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:34.940418 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:34.939841 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:34.940418 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:34.940273 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:34.940418 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:34.940374 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:35.467831 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:35.467093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs\") pod \"network-metrics-daemon-2n29c\" (UID: \"4c96dea8-a54f-4ca2-a3fb-757208554fe3\") " pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:35.467831 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:35.467168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:35.467831 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:35.467309 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:35.467831 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:35.467363 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret podName:2f559047-01c4-4f3b-a5a7-1c183af11e8f nodeName:}" failed. No retries permitted until 2026-04-20 20:05:43.467346724 +0000 UTC m=+18.180028701 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret") pod "global-pull-secret-syncer-7wn8c" (UID: "2f559047-01c4-4f3b-a5a7-1c183af11e8f") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:35.467831 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:35.467746 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:35.467831 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:35.467795 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs podName:4c96dea8-a54f-4ca2-a3fb-757208554fe3 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:43.467779938 +0000 UTC m=+18.180461916 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs") pod "network-metrics-daemon-2n29c" (UID: "4c96dea8-a54f-4ca2-a3fb-757208554fe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:35.668763 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:35.668726 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l8q\" (UniqueName: \"kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q\") pod \"network-check-target-p8x9m\" (UID: \"26d6564f-741b-481d-a1c3-a42559981c32\") " pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:35.668909 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:35.668899 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:35.668969 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:35.668916 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:35.668969 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:35.668929 2576 projected.go:194] Error preparing data for projected volume kube-api-access-84l8q for pod openshift-network-diagnostics/network-check-target-p8x9m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:35.669073 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:35.668988 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q podName:26d6564f-741b-481d-a1c3-a42559981c32 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:43.668970339 +0000 UTC m=+18.381652319 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-84l8q" (UniqueName: "kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q") pod "network-check-target-p8x9m" (UID: "26d6564f-741b-481d-a1c3-a42559981c32") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:35.941150 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:35.940640 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:35.941150 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:35.940749 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:36.939498 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:36.939420 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:36.939660 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:36.939551 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:36.939660 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:36.939622 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:36.939787 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:36.939746 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:37.939227 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:37.939189 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:37.939655 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:37.939306 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:38.939967 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:38.939934 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:38.940355 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:38.940045 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:38.940355 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:38.940153 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:38.940355 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:38.940269 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:39.940162 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:39.940127 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:39.940548 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:39.940240 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:40.939823 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:40.939739 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:40.939823 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:40.939757 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:40.940017 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:40.939839 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:40.940017 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:40.939981 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:41.939943 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:41.939916 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:41.940344 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:41.940028 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:42.939350 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:42.939313 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:42.939596 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:42.939319 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:42.939596 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:42.939427 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:42.939596 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:42.939510 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:43.533095 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:43.533056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs\") pod \"network-metrics-daemon-2n29c\" (UID: \"4c96dea8-a54f-4ca2-a3fb-757208554fe3\") " pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:43.533538 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:43.533127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:43.533538 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:43.533230 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:43.533538 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:43.533297 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs podName:4c96dea8-a54f-4ca2-a3fb-757208554fe3 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:59.533276488 +0000 UTC m=+34.245958474 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs") pod "network-metrics-daemon-2n29c" (UID: "4c96dea8-a54f-4ca2-a3fb-757208554fe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:43.533538 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:43.533233 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:43.533538 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:43.533401 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret podName:2f559047-01c4-4f3b-a5a7-1c183af11e8f nodeName:}" failed. No retries permitted until 2026-04-20 20:05:59.533381682 +0000 UTC m=+34.246063674 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret") pod "global-pull-secret-syncer-7wn8c" (UID: "2f559047-01c4-4f3b-a5a7-1c183af11e8f") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:43.734524 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:43.734487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l8q\" (UniqueName: \"kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q\") pod \"network-check-target-p8x9m\" (UID: \"26d6564f-741b-481d-a1c3-a42559981c32\") " pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:43.734675 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:43.734656 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:43.734727 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:43.734684 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:43.734727 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:43.734696 2576 projected.go:194] Error preparing data for projected volume kube-api-access-84l8q for pod openshift-network-diagnostics/network-check-target-p8x9m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:43.734805 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:43.734757 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q podName:26d6564f-741b-481d-a1c3-a42559981c32 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:59.734736398 +0000 UTC m=+34.447418395 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-84l8q" (UniqueName: "kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q") pod "network-check-target-p8x9m" (UID: "26d6564f-741b-481d-a1c3-a42559981c32") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:43.939859 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:43.939784 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:43.940011 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:43.939903 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:44.940055 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:44.940030 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:44.940338 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:44.940091 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:44.940338 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:44.940198 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:44.940338 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:44.940324 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:45.941206 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:45.940755 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:45.941875 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:45.941215 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:46.031402 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.031361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-44qfk" event={"ID":"d3bb9caf-0865-431c-89be-a2a222a45633","Type":"ContainerStarted","Data":"f2cb784af5dea7d2e7a2d1e0ca58072f5ec4994995ffb978179cfbfa90671aff"} Apr 20 20:05:46.032780 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.032754 2576 generic.go:358] "Generic (PLEG): container finished" podID="45f05212-3e62-445f-af62-b586721d3417" containerID="0491c4de2f30421fa9dffd23d812bc8f138659cfab92e17b4221325a3d6a0a46" exitCode=0 Apr 20 20:05:46.032881 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.032834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-68wc5" event={"ID":"45f05212-3e62-445f-af62-b586721d3417","Type":"ContainerDied","Data":"0491c4de2f30421fa9dffd23d812bc8f138659cfab92e17b4221325a3d6a0a46"} Apr 20 20:05:46.034280 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.034125 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-fsxdh" event={"ID":"0a9e3157-7a26-4d6c-9f83-58c9eca7c51a","Type":"ContainerStarted","Data":"d6deae6fbf8c125f1cc108d991fbbd3ed674cb488c8d2d2665bf5e409137f659"} Apr 20 20:05:46.035783 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.035702 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" event={"ID":"544a3b4c-fde7-4ae8-a14c-e19625554dfe","Type":"ContainerStarted","Data":"1cecc45ef7c912ef2ea77f454ce903af9a4016c81de2452be04cb501f6e1be72"} Apr 20 20:05:46.037901 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.037879 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kzk2j" 
event={"ID":"9cdd6b10-c252-4482-9ba2-40fdae9ef435","Type":"ContainerStarted","Data":"3fad70710e3d6d4f3c99daeb4ae26e09159c30d234bf34042e636fd64e4a0c65"} Apr 20 20:05:46.040499 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.040483 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:05:46.040813 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.040789 2576 generic.go:358] "Generic (PLEG): container finished" podID="f78ac3d9-bcf1-43dd-aac7-1678831ee3ba" containerID="4de1677256134922ffbe8e9ccf43a4c97217bdff7c8e292ade7db3bc1183f009" exitCode=1 Apr 20 20:05:46.040896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.040818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" event={"ID":"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba","Type":"ContainerStarted","Data":"516cfbcad62696888a47ce62de39f8a70b4a462529058f4a00dc085011666fc8"} Apr 20 20:05:46.040896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.040851 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" event={"ID":"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba","Type":"ContainerStarted","Data":"d461c7bc629dbcf237f34fd3fa77f9eeede1cfb18dc54e249264f43d8d4d2804"} Apr 20 20:05:46.040896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.040864 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" event={"ID":"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba","Type":"ContainerStarted","Data":"e137d806f5166babccc09275efcaa7dc34ce3df5ba4fd6916dba27530b73c6e5"} Apr 20 20:05:46.040896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.040878 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" event={"ID":"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba","Type":"ContainerStarted","Data":"dba9e29236f930dec03c8d4b4f9ccef0d620fc4804c38089eca3298b24c71b83"} Apr 20 20:05:46.040896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.040890 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" event={"ID":"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba","Type":"ContainerDied","Data":"4de1677256134922ffbe8e9ccf43a4c97217bdff7c8e292ade7db3bc1183f009"} Apr 20 20:05:46.041089 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.040904 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" event={"ID":"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba","Type":"ContainerStarted","Data":"78847917ded9606eaf580e7b73735751034dd45be43f02ebd1d11f666c07cd27"} Apr 20 20:05:46.042314 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.042290 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" event={"ID":"7f59f5a3-2821-4412-9688-73d69c9bbb4c","Type":"ContainerStarted","Data":"d8db3bf544567aa6a7c78ad2976ab4cc948323bab03c06b47e72234eac54fcc2"} Apr 20 20:05:46.043503 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.043484 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-56wkj" event={"ID":"8c5b0ba6-52e0-4a02-bb42-a4b2b377f250","Type":"ContainerStarted","Data":"291e2b06977203efb0bdc65a6e7aefe0cf7d11b8c4392cc3d3e8f5d86324d2d1"} Apr 20 20:05:46.049013 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.048973 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-23.ec2.internal" podStartSLOduration=19.048960554 podStartE2EDuration="19.048960554s" podCreationTimestamp="2026-04-20 20:05:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:05:31.023545509 +0000 UTC m=+5.736227506" watchObservedRunningTime="2026-04-20 20:05:46.048960554 +0000 UTC m=+20.761642553" Apr 20 20:05:46.049437 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.049411 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-44qfk" podStartSLOduration=4.6084848 podStartE2EDuration="21.04940421s" podCreationTimestamp="2026-04-20 20:05:25 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.677804198 +0000 UTC m=+3.390486187" lastFinishedPulling="2026-04-20 20:05:45.118723606 +0000 UTC m=+19.831405597" observedRunningTime="2026-04-20 20:05:46.048760308 +0000 UTC m=+20.761442303" watchObservedRunningTime="2026-04-20 20:05:46.04940421 +0000 UTC m=+20.762086205" Apr 20 20:05:46.063667 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.063627 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kzk2j" podStartSLOduration=3.638798649 podStartE2EDuration="20.063615995s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.676711189 +0000 UTC m=+3.389393163" lastFinishedPulling="2026-04-20 20:05:45.101528522 +0000 UTC m=+19.814210509" observedRunningTime="2026-04-20 20:05:46.063174207 +0000 UTC m=+20.775856203" watchObservedRunningTime="2026-04-20 20:05:46.063615995 +0000 UTC m=+20.776297989" Apr 20 20:05:46.081733 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.081689 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-fsxdh" podStartSLOduration=3.69290625 podStartE2EDuration="20.081679206s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.671795789 +0000 UTC m=+3.384477779" lastFinishedPulling="2026-04-20 20:05:45.060568739 +0000 UTC m=+19.773250735" observedRunningTime="2026-04-20 20:05:46.081332931 +0000 UTC m=+20.794014928" watchObservedRunningTime="2026-04-20 20:05:46.081679206 +0000 UTC m=+20.794361202" Apr 20 20:05:46.114799 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.114760 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-56wkj" podStartSLOduration=3.729427342 podStartE2EDuration="20.114751417s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.675330678 +0000 UTC m=+3.388012663" lastFinishedPulling="2026-04-20 20:05:45.06065475 +0000 UTC m=+19.773336738" observedRunningTime="2026-04-20 20:05:46.114545453 +0000 UTC m=+20.827227450" watchObservedRunningTime="2026-04-20 20:05:46.114751417 +0000 UTC m=+20.827433411" Apr 20 20:05:46.131429 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.131390 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-lzm5k" podStartSLOduration=3.73844078 podStartE2EDuration="20.131376889s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.667696657 +0000 UTC m=+3.380378635" lastFinishedPulling="2026-04-20 20:05:45.060632756 +0000 UTC m=+19.773314744" observedRunningTime="2026-04-20 20:05:46.131344252 +0000 UTC 
m=+20.844026261" watchObservedRunningTime="2026-04-20 20:05:46.131376889 +0000 UTC m=+20.844058889" Apr 20 20:05:46.635968 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.635933 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 20:05:46.847652 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.847515 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T20:05:46.635953888Z","UUID":"8aa548e2-b29d-437d-b3ec-295a2b4c9f04","Handler":null,"Name":"","Endpoint":""} Apr 20 20:05:46.849210 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.849186 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 20:05:46.849210 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.849216 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 20:05:46.939821 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.939788 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:46.939975 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:46.939787 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:46.939975 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:46.939900 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:46.939975 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:46.939953 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:47.047885 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:47.047852 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" event={"ID":"544a3b4c-fde7-4ae8-a14c-e19625554dfe","Type":"ContainerStarted","Data":"916424766afa549e8556b8fb6b25511cfb2f6823c22f168c6be0df8f837487c7"} Apr 20 20:05:47.049153 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:47.049105 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nt6rc" event={"ID":"45668015-eebd-4a0f-adbc-9ebb7f100cbd","Type":"ContainerStarted","Data":"35039ce1888896ecbc2aec153866daec8079dc69e178725e6a94541c8d632d46"} Apr 20 20:05:47.065486 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:47.065446 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-nt6rc" podStartSLOduration=4.640302922 podStartE2EDuration="21.065434511s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.674445465 +0000 UTC m=+3.387127442" lastFinishedPulling="2026-04-20 20:05:45.099577044 +0000 UTC m=+19.812259031" observedRunningTime="2026-04-20 20:05:47.065176326 +0000 UTC m=+21.777858323" watchObservedRunningTime="2026-04-20 20:05:47.065434511 +0000 UTC m=+21.778116545" Apr 20 20:05:47.939424 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:47.939397 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:47.939559 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:47.939512 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:48.939630 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:48.939441 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:48.940089 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:48.939451 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:48.940089 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:48.939716 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:48.940089 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:48.939833 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:49.054581 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:49.054553 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" event={"ID":"544a3b4c-fde7-4ae8-a14c-e19625554dfe","Type":"ContainerStarted","Data":"f5687be8bfef6e44d774d4c1bb13b06a2ca2a5ba53113a9e75d4370a5a9a2569"} Apr 20 20:05:49.057731 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:49.057709 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:05:49.058208 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:49.058069 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" event={"ID":"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba","Type":"ContainerStarted","Data":"36a25af57224e648cc47d907563a07f53fdb970a12eb14109e75b247844c90e6"} Apr 20 20:05:49.085276 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:49.085253 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-fsxdh" Apr 20 20:05:49.085972 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:49.085954 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-fsxdh" Apr 20 20:05:49.086048 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:49.085985 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-k58dg" podStartSLOduration=3.7380759660000002 podStartE2EDuration="23.085973314s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.676676232 +0000 UTC m=+3.389358210" lastFinishedPulling="2026-04-20 20:05:48.024573567 +0000 UTC m=+22.737255558" observedRunningTime="2026-04-20 20:05:49.085508912 +0000 UTC m=+23.798190911" watchObservedRunningTime="2026-04-20 20:05:49.085973314 +0000 UTC m=+23.798655334" Apr 20 20:05:49.942324 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:49.942288 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:49.942747 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:49.942398 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:50.060039 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:50.060014 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-fsxdh" Apr 20 20:05:50.060340 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:50.060320 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-fsxdh" Apr 20 20:05:50.939598 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:50.939428 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:50.939756 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:50.939428 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:50.939756 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:50.939691 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:50.939756 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:50.939736 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:51.062955 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:51.062924 2576 generic.go:358] "Generic (PLEG): container finished" podID="45f05212-3e62-445f-af62-b586721d3417" containerID="1a27d1ebc879f7c2cd3a1811f4f626bb3e6a3c416dbe24327f837e096f4744bc" exitCode=0 Apr 20 20:05:51.063635 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:51.063008 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-68wc5" event={"ID":"45f05212-3e62-445f-af62-b586721d3417","Type":"ContainerDied","Data":"1a27d1ebc879f7c2cd3a1811f4f626bb3e6a3c416dbe24327f837e096f4744bc"} Apr 20 20:05:51.066011 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:51.065998 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:05:51.066388 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:51.066370 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" event={"ID":"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba","Type":"ContainerStarted","Data":"2bf08fd70e08479216f63bd2d031a827bbb9a56c66f58bed0bf51b83cf866c9e"} Apr 20 20:05:51.066597 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:51.066575 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:51.066664 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:51.066609 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:51.066757 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:51.066741 2576 scope.go:117] "RemoveContainer" containerID="4de1677256134922ffbe8e9ccf43a4c97217bdff7c8e292ade7db3bc1183f009" Apr 20 20:05:51.082331 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:51.082314 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:51.942515 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:51.942485 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:51.942669 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:51.942585 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:52.072327 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:52.072303 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:05:52.072699 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:52.072674 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" event={"ID":"f78ac3d9-bcf1-43dd-aac7-1678831ee3ba","Type":"ContainerStarted","Data":"3882bf7a27e4843353e060eea560d7e7e5e1e8ebfb285ffc7ee5621b8917d2b2"} Apr 20 20:05:52.073040 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:52.073009 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:52.086274 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:52.086254 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" Apr 20 20:05:52.103709 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:52.103670 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt" podStartSLOduration=9.630893335 podStartE2EDuration="26.103651497s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.670669517 +0000 UTC m=+3.383351501" lastFinishedPulling="2026-04-20 20:05:45.143427673 +0000 UTC m=+19.856109663" observedRunningTime="2026-04-20 20:05:52.101769156 +0000 UTC m=+26.814451188" watchObservedRunningTime="2026-04-20 20:05:52.103651497 +0000 UTC m=+26.816333495" Apr 20 20:05:52.261937 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:52.261868 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-p8x9m"] Apr 20 20:05:52.262032 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:52.261967 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:52.262072 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:52.262051 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:52.266152 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:52.266126 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7wn8c"] Apr 20 20:05:52.266259 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:52.266238 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:52.266363 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:52.266329 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:52.266838 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:52.266815 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2n29c"] Apr 20 20:05:52.266938 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:52.266924 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:52.267057 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:52.267038 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:53.076632 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:53.076598 2576 generic.go:358] "Generic (PLEG): container finished" podID="45f05212-3e62-445f-af62-b586721d3417" containerID="651e3eeb9f408141d2740051ef5309d7bdca380328490bc657df0f4dc75ead85" exitCode=0 Apr 20 20:05:53.077156 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:53.076672 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-68wc5" event={"ID":"45f05212-3e62-445f-af62-b586721d3417","Type":"ContainerDied","Data":"651e3eeb9f408141d2740051ef5309d7bdca380328490bc657df0f4dc75ead85"} Apr 20 20:05:53.939962 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:53.939931 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:53.940127 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:53.939934 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:53.940127 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:53.939943 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:53.940204 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:53.940155 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:53.940204 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:53.940020 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:53.940270 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:53.940210 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:55.081367 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:55.081191 2576 generic.go:358] "Generic (PLEG): container finished" podID="45f05212-3e62-445f-af62-b586721d3417" containerID="0589e185f136da6d41e8c7d1d8bf6b2bb6775ce879b487c6161d5889c7d16d19" exitCode=0 Apr 20 20:05:55.081788 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:55.081278 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-68wc5" event={"ID":"45f05212-3e62-445f-af62-b586721d3417","Type":"ContainerDied","Data":"0589e185f136da6d41e8c7d1d8bf6b2bb6775ce879b487c6161d5889c7d16d19"} Apr 20 20:05:55.942856 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:55.942829 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:55.943019 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:55.942829 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:55.943019 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:55.942830 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:55.943019 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:55.943006 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:55.943224 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:55.942916 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:55.943224 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:55.943092 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:57.939699 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:57.939673 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:57.940297 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:57.939725 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:57.940297 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:57.939729 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:57.940297 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:57.939830 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:05:57.940297 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:57.940067 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-7wn8c" podUID="2f559047-01c4-4f3b-a5a7-1c183af11e8f" Apr 20 20:05:57.940297 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:57.940164 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-p8x9m" podUID="26d6564f-741b-481d-a1c3-a42559981c32" Apr 20 20:05:58.061852 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.061828 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-23.ec2.internal" event="NodeReady" Apr 20 20:05:58.062026 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.061959 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 20:05:58.103619 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.103580 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6f7749bc57-7ktsn"] Apr 20 20:05:58.142577 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.142548 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f7749bc57-7ktsn"] Apr 20 20:05:58.142577 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.142582 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-s4dgk"] Apr 20 20:05:58.142793 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.142717 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.145867 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.145846 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 20:05:58.146168 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.146151 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 20:05:58.146261 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.146189 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dhv99\"" Apr 20 20:05:58.146261 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.146240 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 20:05:58.156340 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.156292 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a78ff420-4c0e-42f9-a564-7d110a40f9d0-image-registry-private-configuration\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.156340 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.156326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a78ff420-4c0e-42f9-a564-7d110a40f9d0-trusted-ca\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.156492 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.156347 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a78ff420-4c0e-42f9-a564-7d110a40f9d0-ca-trust-extracted\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.156492 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.156400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-bound-sa-token\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.156492 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.156459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.156492 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.156485 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a78ff420-4c0e-42f9-a564-7d110a40f9d0-installation-pull-secrets\") 
pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.156677 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.156505 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsprl\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-kube-api-access-lsprl\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.156677 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.156534 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-certificates\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.158360 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.158342 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 20:05:58.170579 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.170550 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-s4dgk"] Apr 20 20:05:58.170694 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.170679 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s4dgk" Apr 20 20:05:58.174745 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.174725 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 20:05:58.174906 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.174812 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wkgqm\"" Apr 20 20:05:58.174906 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.174872 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 20:05:58.175244 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.175200 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 20:05:58.222407 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.222387 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-x2zsb"] Apr 20 20:05:58.239248 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.239226 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x2zsb"] Apr 20 20:05:58.239352 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.239330 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-x2zsb" Apr 20 20:05:58.242490 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.242471 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 20:05:58.242585 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.242477 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 20:05:58.242765 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.242754 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m22kk\"" Apr 20 20:05:58.257192 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a78ff420-4c0e-42f9-a564-7d110a40f9d0-installation-pull-secrets\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.257276 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsprl\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-kube-api-access-lsprl\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.257276 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-certificates\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.257276 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88bdb2a3-8afb-427f-bc63-e22166098be9-config-volume\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:05:58.257276 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257268 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g57t\" (UniqueName: \"kubernetes.io/projected/88bdb2a3-8afb-427f-bc63-e22166098be9-kube-api-access-7g57t\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:05:58.257471 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a78ff420-4c0e-42f9-a564-7d110a40f9d0-image-registry-private-configuration\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.257471 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257416 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:05:58.257471 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert\") pod \"ingress-canary-s4dgk\" (UID: \"3dfe201c-4c10-4a3b-98f1-39ca2bed620f\") " pod="openshift-ingress-canary/ingress-canary-s4dgk" Apr 20 20:05:58.257601 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a78ff420-4c0e-42f9-a564-7d110a40f9d0-trusted-ca\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.257601 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257541 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a78ff420-4c0e-42f9-a564-7d110a40f9d0-ca-trust-extracted\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.257699 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257669 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/88bdb2a3-8afb-427f-bc63-e22166098be9-tmp-dir\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:05:58.257748 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2pws\" (UniqueName: \"kubernetes.io/projected/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-kube-api-access-f2pws\") pod \"ingress-canary-s4dgk\" (UID: \"3dfe201c-4c10-4a3b-98f1-39ca2bed620f\") " pod="openshift-ingress-canary/ingress-canary-s4dgk" Apr 20 20:05:58.257813 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-bound-sa-token\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.257898 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257882 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.257963 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257900 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-certificates\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " 
pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.257963 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.257915 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a78ff420-4c0e-42f9-a564-7d110a40f9d0-ca-trust-extracted\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.258068 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:58.258051 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:05:58.258134 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:58.258070 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f7749bc57-7ktsn: secret "image-registry-tls" not found Apr 20 20:05:58.258240 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:58.258220 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls podName:a78ff420-4c0e-42f9-a564-7d110a40f9d0 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:58.758141526 +0000 UTC m=+33.470823500 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls") pod "image-registry-6f7749bc57-7ktsn" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0") : secret "image-registry-tls" not found Apr 20 20:05:58.258367 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.258350 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a78ff420-4c0e-42f9-a564-7d110a40f9d0-trusted-ca\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.261129 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.261091 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a78ff420-4c0e-42f9-a564-7d110a40f9d0-image-registry-private-configuration\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.261204 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.261132 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a78ff420-4c0e-42f9-a564-7d110a40f9d0-installation-pull-secrets\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.270017 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.269996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-bound-sa-token\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.270443 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.270426 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsprl\" (UniqueName: 
\"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-kube-api-access-lsprl\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.358904 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.358877 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88bdb2a3-8afb-427f-bc63-e22166098be9-config-volume\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:05:58.359041 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.358917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7g57t\" (UniqueName: \"kubernetes.io/projected/88bdb2a3-8afb-427f-bc63-e22166098be9-kube-api-access-7g57t\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:05:58.359041 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.358943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:05:58.359041 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.358971 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert\") pod \"ingress-canary-s4dgk\" (UID: \"3dfe201c-4c10-4a3b-98f1-39ca2bed620f\") " pod="openshift-ingress-canary/ingress-canary-s4dgk" Apr 20 20:05:58.359041 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.359004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/88bdb2a3-8afb-427f-bc63-e22166098be9-tmp-dir\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:05:58.359251 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.359056 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f2pws\" (UniqueName: \"kubernetes.io/projected/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-kube-api-access-f2pws\") pod \"ingress-canary-s4dgk\" (UID: \"3dfe201c-4c10-4a3b-98f1-39ca2bed620f\") " pod="openshift-ingress-canary/ingress-canary-s4dgk" Apr 20 20:05:58.359251 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:58.359127 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:05:58.359251 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:58.359199 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls podName:88bdb2a3-8afb-427f-bc63-e22166098be9 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:58.85917924 +0000 UTC m=+33.571861237 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls") pod "dns-default-x2zsb" (UID: "88bdb2a3-8afb-427f-bc63-e22166098be9") : secret "dns-default-metrics-tls" not found Apr 20 20:05:58.359398 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:58.359273 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:05:58.359398 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:58.359331 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert podName:3dfe201c-4c10-4a3b-98f1-39ca2bed620f nodeName:}" failed. No retries permitted until 2026-04-20 20:05:58.859317016 +0000 UTC m=+33.571998995 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert") pod "ingress-canary-s4dgk" (UID: "3dfe201c-4c10-4a3b-98f1-39ca2bed620f") : secret "canary-serving-cert" not found Apr 20 20:05:58.359537 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.359517 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/88bdb2a3-8afb-427f-bc63-e22166098be9-tmp-dir\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:05:58.369900 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.369871 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88bdb2a3-8afb-427f-bc63-e22166098be9-config-volume\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:05:58.370179 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.370159 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2pws\" (UniqueName: \"kubernetes.io/projected/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-kube-api-access-f2pws\") pod \"ingress-canary-s4dgk\" (UID: \"3dfe201c-4c10-4a3b-98f1-39ca2bed620f\") " pod="openshift-ingress-canary/ingress-canary-s4dgk" Apr 20 20:05:58.371268 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.371248 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g57t\" (UniqueName: \"kubernetes.io/projected/88bdb2a3-8afb-427f-bc63-e22166098be9-kube-api-access-7g57t\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:05:58.763321 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.763271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:58.763551 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:58.763418 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:05:58.763551 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:58.763432 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f7749bc57-7ktsn: secret "image-registry-tls" not found Apr 20 
20:05:58.763551 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:58.763491 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls podName:a78ff420-4c0e-42f9-a564-7d110a40f9d0 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:59.763475967 +0000 UTC m=+34.476157956 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls") pod "image-registry-6f7749bc57-7ktsn" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0") : secret "image-registry-tls" not found Apr 20 20:05:58.863686 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.863658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert\") pod \"ingress-canary-s4dgk\" (UID: \"3dfe201c-4c10-4a3b-98f1-39ca2bed620f\") " pod="openshift-ingress-canary/ingress-canary-s4dgk" Apr 20 20:05:58.863865 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:58.863759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:05:58.864347 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:58.864140 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:05:58.864347 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:58.864149 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:05:58.864347 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:58.864247 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls podName:88bdb2a3-8afb-427f-bc63-e22166098be9 nodeName:}" failed. No retries permitted until 2026-04-20 20:05:59.864216977 +0000 UTC m=+34.576898964 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls") pod "dns-default-x2zsb" (UID: "88bdb2a3-8afb-427f-bc63-e22166098be9") : secret "dns-default-metrics-tls" not found Apr 20 20:05:58.867049 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:58.866605 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert podName:3dfe201c-4c10-4a3b-98f1-39ca2bed620f nodeName:}" failed. No retries permitted until 2026-04-20 20:05:59.864306932 +0000 UTC m=+34.576988917 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert") pod "ingress-canary-s4dgk" (UID: "3dfe201c-4c10-4a3b-98f1-39ca2bed620f") : secret "canary-serving-cert" not found Apr 20 20:05:59.569154 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:59.569100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs\") pod \"network-metrics-daemon-2n29c\" (UID: \"4c96dea8-a54f-4ca2-a3fb-757208554fe3\") " pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:59.569777 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:59.569265 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:59.569777 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:59.569282 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:59.569777 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:59.569333 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs podName:4c96dea8-a54f-4ca2-a3fb-757208554fe3 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:31.569319301 +0000 UTC m=+66.282001279 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs") pod "network-metrics-daemon-2n29c" (UID: "4c96dea8-a54f-4ca2-a3fb-757208554fe3") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:05:59.569777 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:59.569395 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:59.569777 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:59.569443 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret podName:2f559047-01c4-4f3b-a5a7-1c183af11e8f nodeName:}" failed. No retries permitted until 2026-04-20 20:06:31.569429175 +0000 UTC m=+66.282111162 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret") pod "global-pull-secret-syncer-7wn8c" (UID: "2f559047-01c4-4f3b-a5a7-1c183af11e8f") : object "kube-system"/"original-pull-secret" not registered Apr 20 20:05:59.771755 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:59.771709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l8q\" (UniqueName: \"kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q\") pod \"network-check-target-p8x9m\" (UID: \"26d6564f-741b-481d-a1c3-a42559981c32\") " pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:59.772019 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:59.771777 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:05:59.772019 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:59.771837 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:05:59.772019 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:59.771860 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:05:59.772019 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:59.771874 2576 projected.go:194] Error preparing data for projected volume kube-api-access-84l8q for pod openshift-network-diagnostics/network-check-target-p8x9m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:59.772019 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:59.771935 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:05:59.772019 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:59.771952 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q podName:26d6564f-741b-481d-a1c3-a42559981c32 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:31.771932471 +0000 UTC m=+66.484614462 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-84l8q" (UniqueName: "kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q") pod "network-check-target-p8x9m" (UID: "26d6564f-741b-481d-a1c3-a42559981c32") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:05:59.772019 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:59.771954 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f7749bc57-7ktsn: secret "image-registry-tls" not found Apr 20 20:05:59.772019 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:59.771992 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls podName:a78ff420-4c0e-42f9-a564-7d110a40f9d0 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:01.771982345 +0000 UTC m=+36.484664320 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls") pod "image-registry-6f7749bc57-7ktsn" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0") : secret "image-registry-tls" not found Apr 20 20:05:59.872351 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:59.872270 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:05:59.872351 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:59.872322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert\") pod \"ingress-canary-s4dgk\" (UID: \"3dfe201c-4c10-4a3b-98f1-39ca2bed620f\") " pod="openshift-ingress-canary/ingress-canary-s4dgk" Apr 20 20:05:59.872552 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:59.872441 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:05:59.872552 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:59.872497 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:05:59.872552 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:59.872520 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls podName:88bdb2a3-8afb-427f-bc63-e22166098be9 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:01.872496123 +0000 UTC m=+36.585178126 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls") pod "dns-default-x2zsb" (UID: "88bdb2a3-8afb-427f-bc63-e22166098be9") : secret "dns-default-metrics-tls" not found Apr 20 20:05:59.872552 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:05:59.872549 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert podName:3dfe201c-4c10-4a3b-98f1-39ca2bed620f nodeName:}" failed. No retries permitted until 2026-04-20 20:06:01.872534718 +0000 UTC m=+36.585216692 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert") pod "ingress-canary-s4dgk" (UID: "3dfe201c-4c10-4a3b-98f1-39ca2bed620f") : secret "canary-serving-cert" not found Apr 20 20:05:59.939681 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:59.939644 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:05:59.939880 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:59.939856 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c" Apr 20 20:05:59.942734 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:59.942704 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 20:05:59.942853 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:59.942758 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 20:05:59.942925 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:59.942859 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gwc22\"" Apr 20 20:05:59.962631 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:59.962609 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:05:59.965488 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:59.965468 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5w7vj\"" Apr 20 20:05:59.965598 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:59.965505 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 20:05:59.965598 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:05:59.965473 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 20:06:01.094335 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:01.094162 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-68wc5" event={"ID":"45f05212-3e62-445f-af62-b586721d3417","Type":"ContainerStarted","Data":"57655d27091d6698259a6d9e65f377e57f418089233d2a3fd55993bf19fd1fdb"} Apr 20 20:06:01.785681 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:01.785646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:06:01.785846 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:01.785755 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:06:01.785846 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:01.785767 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f7749bc57-7ktsn: secret "image-registry-tls" not found Apr 20 20:06:01.785846 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:01.785813 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls podName:a78ff420-4c0e-42f9-a564-7d110a40f9d0 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:05.785800882 +0000 UTC m=+40.498482856 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls") pod "image-registry-6f7749bc57-7ktsn" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0") : secret "image-registry-tls" not found Apr 20 20:06:01.886123 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:01.886085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:06:01.886238 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:01.886138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert\") pod \"ingress-canary-s4dgk\" (UID: \"3dfe201c-4c10-4a3b-98f1-39ca2bed620f\") " pod="openshift-ingress-canary/ingress-canary-s4dgk" Apr 20 20:06:01.886275 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:01.886238 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:01.886275 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:01.886244 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:01.886333 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:01.886286 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert podName:3dfe201c-4c10-4a3b-98f1-39ca2bed620f nodeName:}" failed. No retries permitted until 2026-04-20 20:06:05.886273434 +0000 UTC m=+40.598955408 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert") pod "ingress-canary-s4dgk" (UID: "3dfe201c-4c10-4a3b-98f1-39ca2bed620f") : secret "canary-serving-cert" not found Apr 20 20:06:01.886333 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:01.886298 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls podName:88bdb2a3-8afb-427f-bc63-e22166098be9 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:05.8862922 +0000 UTC m=+40.598974174 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls") pod "dns-default-x2zsb" (UID: "88bdb2a3-8afb-427f-bc63-e22166098be9") : secret "dns-default-metrics-tls" not found Apr 20 20:06:02.098143 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:02.098065 2576 generic.go:358] "Generic (PLEG): container finished" podID="45f05212-3e62-445f-af62-b586721d3417" containerID="57655d27091d6698259a6d9e65f377e57f418089233d2a3fd55993bf19fd1fdb" exitCode=0 Apr 20 20:06:02.098143 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:02.098130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-68wc5" event={"ID":"45f05212-3e62-445f-af62-b586721d3417","Type":"ContainerDied","Data":"57655d27091d6698259a6d9e65f377e57f418089233d2a3fd55993bf19fd1fdb"} Apr 20 20:06:03.102223 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:03.102189 2576 generic.go:358] "Generic (PLEG): container finished" podID="45f05212-3e62-445f-af62-b586721d3417" containerID="c0da959dc968864e9d68a79eb941e314cc98a5131929681f29a03715e6bfe3da" exitCode=0 Apr 20 20:06:03.102613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:03.102242 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-68wc5" event={"ID":"45f05212-3e62-445f-af62-b586721d3417","Type":"ContainerDied","Data":"c0da959dc968864e9d68a79eb941e314cc98a5131929681f29a03715e6bfe3da"} Apr 20 20:06:04.106214 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:04.106182 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-68wc5" event={"ID":"45f05212-3e62-445f-af62-b586721d3417","Type":"ContainerStarted","Data":"2dd8591cf5f7faa74e85b6b71a3bfeccf0f6e95ec1a8fd17a64c7f2c4240ffa7"} Apr 20 20:06:04.131044 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:04.130995 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-68wc5" podStartSLOduration=6.004975395 podStartE2EDuration="38.130982109s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:05:28.673793804 +0000 UTC m=+3.386475792" lastFinishedPulling="2026-04-20 20:06:00.799800519 +0000 UTC m=+35.512482506" observedRunningTime="2026-04-20 20:06:04.129220461 +0000 UTC m=+38.841902457" watchObservedRunningTime="2026-04-20 20:06:04.130982109 +0000 UTC m=+38.843664111" Apr 20 20:06:05.815049 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:05.815016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:06:05.815424 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:05.815169 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:06:05.815424 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:05.815183 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f7749bc57-7ktsn: secret "image-registry-tls" not found Apr 20 20:06:05.815424 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:05.815228 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls podName:a78ff420-4c0e-42f9-a564-7d110a40f9d0 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:13.815214848 +0000 UTC m=+48.527896822 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls") pod "image-registry-6f7749bc57-7ktsn" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0") : secret "image-registry-tls" not found Apr 20 20:06:05.916318 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:05.916296 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:06:05.916318 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:05.916322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert\") pod \"ingress-canary-s4dgk\" (UID: \"3dfe201c-4c10-4a3b-98f1-39ca2bed620f\") " pod="openshift-ingress-canary/ingress-canary-s4dgk" Apr 20 20:06:05.916516 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:05.916447 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:05.916516 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:05.916459 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:05.916705 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:05.916516 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert podName:3dfe201c-4c10-4a3b-98f1-39ca2bed620f nodeName:}" failed. No retries permitted until 2026-04-20 20:06:13.916502204 +0000 UTC m=+48.629184178 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert") pod "ingress-canary-s4dgk" (UID: "3dfe201c-4c10-4a3b-98f1-39ca2bed620f") : secret "canary-serving-cert" not found Apr 20 20:06:05.919250 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:05.916818 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls podName:88bdb2a3-8afb-427f-bc63-e22166098be9 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:13.916795511 +0000 UTC m=+48.629477500 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls") pod "dns-default-x2zsb" (UID: "88bdb2a3-8afb-427f-bc63-e22166098be9") : secret "dns-default-metrics-tls" not found Apr 20 20:06:07.120163 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.120133 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs"] Apr 20 20:06:07.123779 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.123764 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" Apr 20 20:06:07.126555 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.126539 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 20:06:07.126555 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.126545 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 20:06:07.126716 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.126538 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 20:06:07.126716 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.126539 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 20:06:07.132976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.132956 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs"] Apr 20 20:06:07.225921 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.225891 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m98jf\" (UniqueName: \"kubernetes.io/projected/ee6a48cf-727f-4b03-a512-585f80242787-kube-api-access-m98jf\") pod \"klusterlet-addon-workmgr-f4898cd7d-ljxhs\" (UID: \"ee6a48cf-727f-4b03-a512-585f80242787\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" Apr 20 20:06:07.226017 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.225965 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee6a48cf-727f-4b03-a512-585f80242787-tmp\") pod \"klusterlet-addon-workmgr-f4898cd7d-ljxhs\" (UID: \"ee6a48cf-727f-4b03-a512-585f80242787\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" Apr 20 20:06:07.226056 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.226019 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ee6a48cf-727f-4b03-a512-585f80242787-klusterlet-config\") pod \"klusterlet-addon-workmgr-f4898cd7d-ljxhs\" (UID: \"ee6a48cf-727f-4b03-a512-585f80242787\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" Apr 20 20:06:07.326430 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.326408 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m98jf\" (UniqueName: \"kubernetes.io/projected/ee6a48cf-727f-4b03-a512-585f80242787-kube-api-access-m98jf\") pod \"klusterlet-addon-workmgr-f4898cd7d-ljxhs\" (UID: \"ee6a48cf-727f-4b03-a512-585f80242787\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" Apr 20 20:06:07.326586 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.326461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee6a48cf-727f-4b03-a512-585f80242787-tmp\") pod \"klusterlet-addon-workmgr-f4898cd7d-ljxhs\" (UID: \"ee6a48cf-727f-4b03-a512-585f80242787\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" Apr 20 20:06:07.326586 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.326563 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ee6a48cf-727f-4b03-a512-585f80242787-klusterlet-config\") pod \"klusterlet-addon-workmgr-f4898cd7d-ljxhs\" (UID: \"ee6a48cf-727f-4b03-a512-585f80242787\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" Apr 20 20:06:07.327640 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.327618 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee6a48cf-727f-4b03-a512-585f80242787-tmp\") pod \"klusterlet-addon-workmgr-f4898cd7d-ljxhs\" (UID: \"ee6a48cf-727f-4b03-a512-585f80242787\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" Apr 20 20:06:07.329734 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.329714 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ee6a48cf-727f-4b03-a512-585f80242787-klusterlet-config\") pod \"klusterlet-addon-workmgr-f4898cd7d-ljxhs\" (UID: \"ee6a48cf-727f-4b03-a512-585f80242787\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" Apr 20 20:06:07.334754 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.334736 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m98jf\" (UniqueName: \"kubernetes.io/projected/ee6a48cf-727f-4b03-a512-585f80242787-kube-api-access-m98jf\") pod \"klusterlet-addon-workmgr-f4898cd7d-ljxhs\" (UID: \"ee6a48cf-727f-4b03-a512-585f80242787\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" Apr 20 20:06:07.431866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.431809 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" Apr 20 20:06:07.560149 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:07.560121 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs"] Apr 20 20:06:07.563210 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:06:07.563179 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee6a48cf_727f_4b03_a512_585f80242787.slice/crio-cba45888bfda36ad074ba773cbeed5ebd48ebc777833b874537ede59c886b4e6 WatchSource:0}: Error finding container cba45888bfda36ad074ba773cbeed5ebd48ebc777833b874537ede59c886b4e6: Status 404 returned error can't find the container with id cba45888bfda36ad074ba773cbeed5ebd48ebc777833b874537ede59c886b4e6 Apr 20 20:06:08.113344 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:08.113308 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" event={"ID":"ee6a48cf-727f-4b03-a512-585f80242787","Type":"ContainerStarted","Data":"cba45888bfda36ad074ba773cbeed5ebd48ebc777833b874537ede59c886b4e6"} Apr 20 20:06:12.122241 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:12.122199 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" event={"ID":"ee6a48cf-727f-4b03-a512-585f80242787","Type":"ContainerStarted","Data":"77feb5849cdad77328b0aa4af693dff947b83fc023c4d8c723823dda81c8a447"} Apr 20 20:06:12.122573 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:12.122398 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" Apr 20 20:06:12.123890 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:12.123866 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" Apr 20 20:06:12.152121 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:12.152065 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" podStartSLOduration=1.182491012 podStartE2EDuration="5.152055543s" podCreationTimestamp="2026-04-20 20:06:07 +0000 UTC" firstStartedPulling="2026-04-20 20:06:07.564938867 +0000 UTC m=+42.277620842" lastFinishedPulling="2026-04-20 20:06:11.534503384 +0000 UTC m=+46.247185373" observedRunningTime="2026-04-20 20:06:12.137382219 +0000 UTC m=+46.850064215" watchObservedRunningTime="2026-04-20 20:06:12.152055543 +0000 UTC m=+46.864737538" Apr 20 20:06:13.875515 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:13.875481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:06:13.875843 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:13.875599 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:06:13.875843 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:13.875610 2576 projected.go:194] Error preparing data for projected 
volume registry-tls for pod openshift-image-registry/image-registry-6f7749bc57-7ktsn: secret "image-registry-tls" not found Apr 20 20:06:13.875843 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:13.875658 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls podName:a78ff420-4c0e-42f9-a564-7d110a40f9d0 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:29.875643934 +0000 UTC m=+64.588325908 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls") pod "image-registry-6f7749bc57-7ktsn" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0") : secret "image-registry-tls" not found Apr 20 20:06:13.976162 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:13.976136 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:06:13.976286 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:13.976166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert\") pod \"ingress-canary-s4dgk\" (UID: \"3dfe201c-4c10-4a3b-98f1-39ca2bed620f\") " pod="openshift-ingress-canary/ingress-canary-s4dgk" Apr 20 20:06:13.976286 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:13.976266 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:13.976345 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:13.976314 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:13.976376 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:13.976317 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls podName:88bdb2a3-8afb-427f-bc63-e22166098be9 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:29.976302591 +0000 UTC m=+64.688984566 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls") pod "dns-default-x2zsb" (UID: "88bdb2a3-8afb-427f-bc63-e22166098be9") : secret "dns-default-metrics-tls" not found Apr 20 20:06:13.976419 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:13.976384 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert podName:3dfe201c-4c10-4a3b-98f1-39ca2bed620f nodeName:}" failed. No retries permitted until 2026-04-20 20:06:29.97636181 +0000 UTC m=+64.689043787 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert") pod "ingress-canary-s4dgk" (UID: "3dfe201c-4c10-4a3b-98f1-39ca2bed620f") : secret "canary-serving-cert" not found
Apr 20 20:06:24.093023 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:24.092998 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z55qt"
Apr 20 20:06:29.891596 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:29.891565 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn"
Apr 20 20:06:29.891954 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:29.891706 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 20:06:29.891954 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:29.891723 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f7749bc57-7ktsn: secret "image-registry-tls" not found
Apr 20 20:06:29.891954 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:29.891777 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls podName:a78ff420-4c0e-42f9-a564-7d110a40f9d0 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:01.891762249 +0000 UTC m=+96.604444224 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls") pod "image-registry-6f7749bc57-7ktsn" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0") : secret "image-registry-tls" not found
Apr 20 20:06:29.992185 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:29.992153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb"
Apr 20 20:06:29.992185 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:29.992190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert\") pod \"ingress-canary-s4dgk\" (UID: \"3dfe201c-4c10-4a3b-98f1-39ca2bed620f\") " pod="openshift-ingress-canary/ingress-canary-s4dgk"
Apr 20 20:06:29.992364 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:29.992285 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:06:29.992364 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:29.992299 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:06:29.992364 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:29.992347 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls podName:88bdb2a3-8afb-427f-bc63-e22166098be9 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:01.992330443 +0000 UTC m=+96.705012420 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls") pod "dns-default-x2zsb" (UID: "88bdb2a3-8afb-427f-bc63-e22166098be9") : secret "dns-default-metrics-tls" not found
Apr 20 20:06:29.992364 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:29.992365 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert podName:3dfe201c-4c10-4a3b-98f1-39ca2bed620f nodeName:}" failed. No retries permitted until 2026-04-20 20:07:01.99235714 +0000 UTC m=+96.705039114 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert") pod "ingress-canary-s4dgk" (UID: "3dfe201c-4c10-4a3b-98f1-39ca2bed620f") : secret "canary-serving-cert" not found
Apr 20 20:06:31.602302 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:31.602248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs\") pod \"network-metrics-daemon-2n29c\" (UID: \"4c96dea8-a54f-4ca2-a3fb-757208554fe3\") " pod="openshift-multus/network-metrics-daemon-2n29c"
Apr 20 20:06:31.602302 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:31.602303 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c"
Apr 20 20:06:31.605299 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:31.605276 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 20:06:31.605382 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:31.605323 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 20 20:06:31.612669 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:31.612653 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 20:06:31.612737 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:06:31.612714 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs podName:4c96dea8-a54f-4ca2-a3fb-757208554fe3 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:35.612698297 +0000 UTC m=+130.325380271 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs") pod "network-metrics-daemon-2n29c" (UID: "4c96dea8-a54f-4ca2-a3fb-757208554fe3") : secret "metrics-daemon-secret" not found
Apr 20 20:06:31.614698 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:31.614677 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2f559047-01c4-4f3b-a5a7-1c183af11e8f-original-pull-secret\") pod \"global-pull-secret-syncer-7wn8c\" (UID: \"2f559047-01c4-4f3b-a5a7-1c183af11e8f\") " pod="kube-system/global-pull-secret-syncer-7wn8c"
Apr 20 20:06:31.756806 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:31.756779 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-7wn8c"
Apr 20 20:06:31.804022 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:31.803990 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84l8q\" (UniqueName: \"kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q\") pod \"network-check-target-p8x9m\" (UID: \"26d6564f-741b-481d-a1c3-a42559981c32\") " pod="openshift-network-diagnostics/network-check-target-p8x9m"
Apr 20 20:06:31.807101 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:31.806948 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 20:06:31.817131 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:31.817095 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 20:06:31.827730 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:31.827685 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84l8q\" (UniqueName: \"kubernetes.io/projected/26d6564f-741b-481d-a1c3-a42559981c32-kube-api-access-84l8q\") pod \"network-check-target-p8x9m\" (UID: \"26d6564f-741b-481d-a1c3-a42559981c32\") " pod="openshift-network-diagnostics/network-check-target-p8x9m"
Apr 20 20:06:31.869809 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:31.869751 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-7wn8c"]
Apr 20 20:06:31.872455 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:06:31.872419 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f559047_01c4_4f3b_a5a7_1c183af11e8f.slice/crio-25e435c404face206f5a36e77ff6267b3e4b2330d2d1c2f8cce392f6ffcad43c WatchSource:0}: Error finding container 25e435c404face206f5a36e77ff6267b3e4b2330d2d1c2f8cce392f6ffcad43c: Status 404 returned error can't find the container with id 25e435c404face206f5a36e77ff6267b3e4b2330d2d1c2f8cce392f6ffcad43c
Apr 20 20:06:32.075296 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:32.075263 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5w7vj\""
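All of the mount failures above have the same shape: the pod spec references a secret that its owning operator has not created yet, so MountVolume.SetUp fails with secret "..." not found and the kubelet schedules a retry. A quick way to confirm which of the reported secrets exist yet is a client-go spot check like the sketch below; it is illustrative only (not part of the captured log) and assumes a reachable kubeconfig at the default location.

```go
// Illustrative spot check, not part of the log: ask the API server which of
// the secrets reported as missing above exist yet. Namespace/name pairs are
// taken verbatim from the "Couldn't get secret" lines.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	missing := map[string]string{ // namespace -> secret name
		"openshift-image-registry": "image-registry-tls",
		"openshift-dns":            "dns-default-metrics-tls",
		"openshift-ingress-canary": "canary-serving-cert",
		"openshift-multus":         "metrics-daemon-secret",
	}
	for ns, name := range missing {
		if _, err := cs.CoreV1().Secrets(ns).Get(context.TODO(), name, metav1.GetOptions{}); err != nil {
			fmt.Printf("%s/%s: %v\n", ns, name, err)
		} else {
			fmt.Printf("%s/%s: present\n", ns, name)
		}
	}
}
```

In this boot the secrets do appear a couple of minutes later, at which point the pending mounts succeed (see the 20:08:05 to 20:08:09 entries below).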
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:06:32.159006 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:32.158978 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7wn8c" event={"ID":"2f559047-01c4-4f3b-a5a7-1c183af11e8f","Type":"ContainerStarted","Data":"25e435c404face206f5a36e77ff6267b3e4b2330d2d1c2f8cce392f6ffcad43c"} Apr 20 20:06:32.188659 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:32.188633 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-p8x9m"] Apr 20 20:06:32.191536 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:06:32.191513 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26d6564f_741b_481d_a1c3_a42559981c32.slice/crio-4748c6f493bf2c3dc8a88244964b11107bde1c63abc0b6ac2a75b512e6111a50 WatchSource:0}: Error finding container 4748c6f493bf2c3dc8a88244964b11107bde1c63abc0b6ac2a75b512e6111a50: Status 404 returned error can't find the container with id 4748c6f493bf2c3dc8a88244964b11107bde1c63abc0b6ac2a75b512e6111a50 Apr 20 20:06:33.162928 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:33.162884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-p8x9m" event={"ID":"26d6564f-741b-481d-a1c3-a42559981c32","Type":"ContainerStarted","Data":"4748c6f493bf2c3dc8a88244964b11107bde1c63abc0b6ac2a75b512e6111a50"} Apr 20 20:06:37.172850 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:37.172810 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-7wn8c" event={"ID":"2f559047-01c4-4f3b-a5a7-1c183af11e8f","Type":"ContainerStarted","Data":"56b9eb577febae724763539db6b4b27135ed1a3ff57e3e469ba659b166c2ee18"} Apr 20 20:06:37.174080 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:37.174055 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-p8x9m" event={"ID":"26d6564f-741b-481d-a1c3-a42559981c32","Type":"ContainerStarted","Data":"a223a49873ac525e14a2cce339d6c5c387f5eddb8ee3922dd8a248635519a179"} Apr 20 20:06:37.174195 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:37.174155 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:06:37.192988 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:37.192940 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-7wn8c" podStartSLOduration=66.71576303 podStartE2EDuration="1m11.192929176s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:06:31.874134397 +0000 UTC m=+66.586816372" lastFinishedPulling="2026-04-20 20:06:36.351300539 +0000 UTC m=+71.063982518" observedRunningTime="2026-04-20 20:06:37.191939747 +0000 UTC m=+71.904621742" watchObservedRunningTime="2026-04-20 20:06:37.192929176 +0000 UTC m=+71.905611171" Apr 20 20:06:37.208447 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:06:37.208397 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-p8x9m" podStartSLOduration=67.054573437 podStartE2EDuration="1m11.208382811s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="2026-04-20 20:06:32.193225077 +0000 UTC m=+66.905907052" lastFinishedPulling="2026-04-20 
20:06:36.347034437 +0000 UTC m=+71.059716426" observedRunningTime="2026-04-20 20:06:37.207846231 +0000 UTC m=+71.920528229" watchObservedRunningTime="2026-04-20 20:06:37.208382811 +0000 UTC m=+71.921064812" Apr 20 20:07:01.934092 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:01.934037 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:07:01.934502 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:01.934193 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 20:07:01.934502 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:01.934213 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6f7749bc57-7ktsn: secret "image-registry-tls" not found Apr 20 20:07:01.934502 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:01.934270 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls podName:a78ff420-4c0e-42f9-a564-7d110a40f9d0 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:05.934254309 +0000 UTC m=+160.646936284 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls") pod "image-registry-6f7749bc57-7ktsn" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0") : secret "image-registry-tls" not found Apr 20 20:07:02.035232 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:02.035193 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:07:02.035232 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:02.035233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert\") pod \"ingress-canary-s4dgk\" (UID: \"3dfe201c-4c10-4a3b-98f1-39ca2bed620f\") " pod="openshift-ingress-canary/ingress-canary-s4dgk" Apr 20 20:07:02.035404 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:02.035341 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:07:02.035404 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:02.035341 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:07:02.035479 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:02.035410 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert podName:3dfe201c-4c10-4a3b-98f1-39ca2bed620f nodeName:}" failed. No retries permitted until 2026-04-20 20:08:06.035391975 +0000 UTC m=+160.748073971 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert") pod "ingress-canary-s4dgk" (UID: "3dfe201c-4c10-4a3b-98f1-39ca2bed620f") : secret "canary-serving-cert" not found Apr 20 20:07:02.035479 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:02.035429 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls podName:88bdb2a3-8afb-427f-bc63-e22166098be9 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:06.035420228 +0000 UTC m=+160.748102204 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls") pod "dns-default-x2zsb" (UID: "88bdb2a3-8afb-427f-bc63-e22166098be9") : secret "dns-default-metrics-tls" not found Apr 20 20:07:08.178441 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:08.178411 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-p8x9m" Apr 20 20:07:35.670091 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:35.670054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs\") pod \"network-metrics-daemon-2n29c\" (UID: \"4c96dea8-a54f-4ca2-a3fb-757208554fe3\") " pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:07:35.670541 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:35.670208 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:07:35.670541 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:35.670281 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs podName:4c96dea8-a54f-4ca2-a3fb-757208554fe3 nodeName:}" failed. No retries permitted until 2026-04-20 20:09:37.670264488 +0000 UTC m=+252.382946472 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs") pod "network-metrics-daemon-2n29c" (UID: "4c96dea8-a54f-4ca2-a3fb-757208554fe3") : secret "metrics-daemon-secret" not found Apr 20 20:07:46.173509 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:46.173483 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kzk2j_9cdd6b10-c252-4482-9ba2-40fdae9ef435/dns-node-resolver/0.log" Apr 20 20:07:46.973454 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:46.973425 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-56wkj_8c5b0ba6-52e0-4a02-bb42-a4b2b377f250/node-ca/0.log" Apr 20 20:07:50.444448 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.444414 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4"] Apr 20 20:07:50.447301 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.447284 2576 util.go:30] "No sandbox for pod can be found. 
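The durationBeforeRetry values climbing through these entries (32s, then 1m4s, and 2m2s for metrics-certs just above) trace the kubelet's per-operation exponential backoff for failed volume mounts. The sketch below reproduces the ladder; the 500ms initial delay, factor of 2, and 2m2s cap are assumptions based on the kubelet's goroutinemap/exponentialbackoff package rather than a quote of its exact source, but they agree with every delay printed in this log (500ms through 8s appear further down for networking-console-plugin-cert).

```go
// Illustrative reconstruction of the durationBeforeRetry ladder seen in
// this log. The constants are assumed, not quoted from the kubelet source,
// but they match all observed delays.
package main

import (
	"fmt"
	"time"
)

func main() {
	const (
		initialDelay = 500 * time.Millisecond
		maxDelay     = 2*time.Minute + 2*time.Second // logged as "2m2s"
	)
	d := initialDelay
	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("failure %2d -> durationBeforeRetry %v\n", attempt, d)
		d *= 2
		if d > maxDelay {
			d = maxDelay // the backoff stops growing once the cap is hit
		}
	}
}
```

Run, this prints 500ms, 1s, 2s, 4s, 8s, 16s, 32s, 1m4s, 2m2s, 2m2s, which is exactly the progression the retries in this section walk through.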
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" Apr 20 20:07:50.450094 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.450071 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 20 20:07:50.450223 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.450071 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 20 20:07:50.451257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.451231 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-ccsj8\"" Apr 20 20:07:50.451257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.451245 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 20 20:07:50.451386 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.451234 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:07:50.456633 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.456611 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4"] Apr 20 20:07:50.571906 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.571883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc006b49-b340-4b63-9917-d78702246b64-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jgpb4\" (UID: \"bc006b49-b340-4b63-9917-d78702246b64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" Apr 20 20:07:50.572049 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.571935 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc006b49-b340-4b63-9917-d78702246b64-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jgpb4\" (UID: \"bc006b49-b340-4b63-9917-d78702246b64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" Apr 20 20:07:50.572049 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.571968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zj7j\" (UniqueName: \"kubernetes.io/projected/bc006b49-b340-4b63-9917-d78702246b64-kube-api-access-9zj7j\") pod \"kube-storage-version-migrator-operator-6769c5d45-jgpb4\" (UID: \"bc006b49-b340-4b63-9917-d78702246b64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" Apr 20 20:07:50.672356 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.672325 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc006b49-b340-4b63-9917-d78702246b64-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jgpb4\" (UID: \"bc006b49-b340-4b63-9917-d78702246b64\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" Apr 20 20:07:50.672488 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.672379 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc006b49-b340-4b63-9917-d78702246b64-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jgpb4\" (UID: \"bc006b49-b340-4b63-9917-d78702246b64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" Apr 20 20:07:50.672488 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.672416 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zj7j\" (UniqueName: \"kubernetes.io/projected/bc006b49-b340-4b63-9917-d78702246b64-kube-api-access-9zj7j\") pod \"kube-storage-version-migrator-operator-6769c5d45-jgpb4\" (UID: \"bc006b49-b340-4b63-9917-d78702246b64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" Apr 20 20:07:50.672814 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.672795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc006b49-b340-4b63-9917-d78702246b64-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-jgpb4\" (UID: \"bc006b49-b340-4b63-9917-d78702246b64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" Apr 20 20:07:50.674458 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.674437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc006b49-b340-4b63-9917-d78702246b64-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-jgpb4\" (UID: \"bc006b49-b340-4b63-9917-d78702246b64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" Apr 20 20:07:50.680552 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.680526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zj7j\" (UniqueName: \"kubernetes.io/projected/bc006b49-b340-4b63-9917-d78702246b64-kube-api-access-9zj7j\") pod \"kube-storage-version-migrator-operator-6769c5d45-jgpb4\" (UID: \"bc006b49-b340-4b63-9917-d78702246b64\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" Apr 20 20:07:50.755339 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.755289 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" Apr 20 20:07:50.865981 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:50.865956 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4"] Apr 20 20:07:50.869158 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:07:50.869130 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc006b49_b340_4b63_9917_d78702246b64.slice/crio-9745ecd5d09ab5f23f15ea5724078b6610425fa638dc4eb82d6179a09a6909a9 WatchSource:0}: Error finding container 9745ecd5d09ab5f23f15ea5724078b6610425fa638dc4eb82d6179a09a6909a9: Status 404 returned error can't find the container with id 9745ecd5d09ab5f23f15ea5724078b6610425fa638dc4eb82d6179a09a6909a9 Apr 20 20:07:51.313686 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:51.313645 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" event={"ID":"bc006b49-b340-4b63-9917-d78702246b64","Type":"ContainerStarted","Data":"9745ecd5d09ab5f23f15ea5724078b6610425fa638dc4eb82d6179a09a6909a9"} Apr 20 20:07:51.860497 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:51.860465 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-tsc9v"] Apr 20 20:07:51.863484 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:51.863460 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tsc9v" Apr 20 20:07:51.866151 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:51.866131 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-nvqfr\"" Apr 20 20:07:51.870965 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:51.870940 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-tsc9v"] Apr 20 20:07:51.980395 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:51.980360 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zhxk\" (UniqueName: \"kubernetes.io/projected/cbd39223-9397-40f2-b3bf-fdd7ce3ab9ac-kube-api-access-8zhxk\") pod \"network-check-source-8894fc9bd-tsc9v\" (UID: \"cbd39223-9397-40f2-b3bf-fdd7ce3ab9ac\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tsc9v" Apr 20 20:07:52.080720 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:52.080687 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zhxk\" (UniqueName: \"kubernetes.io/projected/cbd39223-9397-40f2-b3bf-fdd7ce3ab9ac-kube-api-access-8zhxk\") pod \"network-check-source-8894fc9bd-tsc9v\" (UID: \"cbd39223-9397-40f2-b3bf-fdd7ce3ab9ac\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tsc9v" Apr 20 20:07:52.090148 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:52.090121 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zhxk\" (UniqueName: \"kubernetes.io/projected/cbd39223-9397-40f2-b3bf-fdd7ce3ab9ac-kube-api-access-8zhxk\") pod \"network-check-source-8894fc9bd-tsc9v\" (UID: \"cbd39223-9397-40f2-b3bf-fdd7ce3ab9ac\") " 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tsc9v" Apr 20 20:07:52.174936 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:52.174875 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tsc9v" Apr 20 20:07:52.293203 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:52.293176 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-tsc9v"] Apr 20 20:07:52.296342 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:07:52.296313 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbd39223_9397_40f2_b3bf_fdd7ce3ab9ac.slice/crio-bbd6343032091903747df300a621794e4fc1e0ef0c6755c33540f99f5786bec6 WatchSource:0}: Error finding container bbd6343032091903747df300a621794e4fc1e0ef0c6755c33540f99f5786bec6: Status 404 returned error can't find the container with id bbd6343032091903747df300a621794e4fc1e0ef0c6755c33540f99f5786bec6 Apr 20 20:07:52.316218 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:52.316188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tsc9v" event={"ID":"cbd39223-9397-40f2-b3bf-fdd7ce3ab9ac","Type":"ContainerStarted","Data":"bbd6343032091903747df300a621794e4fc1e0ef0c6755c33540f99f5786bec6"} Apr 20 20:07:53.319957 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:53.319916 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tsc9v" event={"ID":"cbd39223-9397-40f2-b3bf-fdd7ce3ab9ac","Type":"ContainerStarted","Data":"b87e454e92c18ae9eee31090706f4e1bec2e69da194b408ba1550f1b2ed236ef"} Apr 20 20:07:53.321205 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:53.321181 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" event={"ID":"bc006b49-b340-4b63-9917-d78702246b64","Type":"ContainerStarted","Data":"1acae262c21e958932ff87f8782982d0f9f46fbb078a5fe863e376e83e0ca704"} Apr 20 20:07:53.337890 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:53.337816 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-tsc9v" podStartSLOduration=2.337802855 podStartE2EDuration="2.337802855s" podCreationTimestamp="2026-04-20 20:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:07:53.337273292 +0000 UTC m=+148.049955290" watchObservedRunningTime="2026-04-20 20:07:53.337802855 +0000 UTC m=+148.050484853" Apr 20 20:07:53.354075 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:53.354023 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" podStartSLOduration=1.151287899 podStartE2EDuration="3.354008871s" podCreationTimestamp="2026-04-20 20:07:50 +0000 UTC" firstStartedPulling="2026-04-20 20:07:50.870894279 +0000 UTC m=+145.583576254" lastFinishedPulling="2026-04-20 20:07:53.07361525 +0000 UTC m=+147.786297226" observedRunningTime="2026-04-20 20:07:53.353441102 +0000 UTC m=+148.066123100" watchObservedRunningTime="2026-04-20 20:07:53.354008871 +0000 UTC m=+148.066690869" Apr 20 20:07:53.934275 ip-10-0-143-23 
Apr 20 20:07:53.934275 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:53.934234 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-85nmr"]
Apr 20 20:07:53.937141 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:53.937125 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr"
Apr 20 20:07:53.942555 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:53.942523 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 20 20:07:53.942746 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:53.942731 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 20 20:07:53.943820 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:53.943796 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-tgmk5\""
Apr 20 20:07:53.956553 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:53.956530 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-85nmr"]
Apr 20 20:07:53.995831 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:53.995810 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-85nmr\" (UID: \"10ad3736-4c80-4544-ad33-36bdd65c0779\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr"
Apr 20 20:07:53.995942 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:53.995930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/10ad3736-4c80-4544-ad33-36bdd65c0779-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-85nmr\" (UID: \"10ad3736-4c80-4544-ad33-36bdd65c0779\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr"
Apr 20 20:07:54.097153 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:54.097127 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/10ad3736-4c80-4544-ad33-36bdd65c0779-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-85nmr\" (UID: \"10ad3736-4c80-4544-ad33-36bdd65c0779\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr"
Apr 20 20:07:54.097279 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:54.097174 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-85nmr\" (UID: \"10ad3736-4c80-4544-ad33-36bdd65c0779\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr"
Apr 20 20:07:54.097328 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:54.097314 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 20:07:54.097381 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:54.097372 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert podName:10ad3736-4c80-4544-ad33-36bdd65c0779 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:54.597356684 +0000 UTC m=+149.310038663 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-85nmr" (UID: "10ad3736-4c80-4544-ad33-36bdd65c0779") : secret "networking-console-plugin-cert" not found
Apr 20 20:07:54.097895 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:54.097872 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/10ad3736-4c80-4544-ad33-36bdd65c0779-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-85nmr\" (UID: \"10ad3736-4c80-4544-ad33-36bdd65c0779\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr"
Apr 20 20:07:54.600439 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:54.600405 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-85nmr\" (UID: \"10ad3736-4c80-4544-ad33-36bdd65c0779\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr"
Apr 20 20:07:54.600898 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:54.600528 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 20:07:54.600898 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:54.600586 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert podName:10ad3736-4c80-4544-ad33-36bdd65c0779 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:55.600570024 +0000 UTC m=+150.313251999 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-85nmr" (UID: "10ad3736-4c80-4544-ad33-36bdd65c0779") : secret "networking-console-plugin-cert" not found
Apr 20 20:07:55.367547 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:55.367515 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zszld"]
Apr 20 20:07:55.370713 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:55.370690 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zszld"
Apr 20 20:07:55.375151 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:55.375127 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 20 20:07:55.375294 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:55.375210 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-wn54g\""
Apr 20 20:07:55.375294 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:55.375247 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 20 20:07:55.380911 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:55.380890 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zszld"]
Apr 20 20:07:55.509905 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:55.509868 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-279x2\" (UniqueName: \"kubernetes.io/projected/88f6fa69-2bdb-4e93-941b-4e14d4be2a2f-kube-api-access-279x2\") pod \"migrator-74bb7799d9-zszld\" (UID: \"88f6fa69-2bdb-4e93-941b-4e14d4be2a2f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zszld"
Apr 20 20:07:55.610325 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:55.610289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-85nmr\" (UID: \"10ad3736-4c80-4544-ad33-36bdd65c0779\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr"
Apr 20 20:07:55.610629 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:55.610356 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-279x2\" (UniqueName: \"kubernetes.io/projected/88f6fa69-2bdb-4e93-941b-4e14d4be2a2f-kube-api-access-279x2\") pod \"migrator-74bb7799d9-zszld\" (UID: \"88f6fa69-2bdb-4e93-941b-4e14d4be2a2f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zszld"
Apr 20 20:07:55.610629 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:55.610446 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 20:07:55.610629 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:55.610522 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert podName:10ad3736-4c80-4544-ad33-36bdd65c0779 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:57.610501702 +0000 UTC m=+152.323183691 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-85nmr" (UID: "10ad3736-4c80-4544-ad33-36bdd65c0779") : secret "networking-console-plugin-cert" not found
Apr 20 20:07:55.619123 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:55.619062 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-279x2\" (UniqueName: \"kubernetes.io/projected/88f6fa69-2bdb-4e93-941b-4e14d4be2a2f-kube-api-access-279x2\") pod \"migrator-74bb7799d9-zszld\" (UID: \"88f6fa69-2bdb-4e93-941b-4e14d4be2a2f\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zszld"
Apr 20 20:07:55.679013 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:55.678991 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zszld"
Apr 20 20:07:55.791078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:55.791052 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zszld"]
Apr 20 20:07:55.793362 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:07:55.793336 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88f6fa69_2bdb_4e93_941b_4e14d4be2a2f.slice/crio-ef6e14a503727be118a0a20afed2aad0df6fff5a4895a50b6015cc340add68fb WatchSource:0}: Error finding container ef6e14a503727be118a0a20afed2aad0df6fff5a4895a50b6015cc340add68fb: Status 404 returned error can't find the container with id ef6e14a503727be118a0a20afed2aad0df6fff5a4895a50b6015cc340add68fb
Apr 20 20:07:56.329846 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.329808 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zszld" event={"ID":"88f6fa69-2bdb-4e93-941b-4e14d4be2a2f","Type":"ContainerStarted","Data":"ef6e14a503727be118a0a20afed2aad0df6fff5a4895a50b6015cc340add68fb"}
Apr 20 20:07:56.718476 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.718396 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-4rnlw"]
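networking-console-plugin-cert is still missing while its backoff climbs 500ms, 1s, 2s: the secret is presumably a serving certificate that the service-ca operator will publish, and that operator's own pod is only being scheduled in the entries just above. The kubelet effectively polls for the secret on its backoff schedule; a watch, as in the sketch below, is the event-driven equivalent (illustrative only, and again assuming a reachable kubeconfig, not part of the captured log).

```go
// Illustrative watcher, not from the log: block until a named secret
// exists, which is effectively what the kubelet's retry loop above is
// polling for. Namespace and name come from the failing mount.
package main

import (
	"context"
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	w, err := cs.CoreV1().Secrets("openshift-network-console").Watch(context.TODO(), metav1.ListOptions{
		FieldSelector: "metadata.name=networking-console-plugin-cert",
	})
	if err != nil {
		panic(err)
	}
	defer w.Stop()

	for ev := range w.ResultChan() {
		if s, ok := ev.Object.(*corev1.Secret); ok {
			fmt.Printf("secret %s/%s now exists (event %s)\n", s.Namespace, s.Name, ev.Type)
			return
		}
	}
}
```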
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-4rnlw" Apr 20 20:07:56.729708 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.729661 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 20:07:56.730946 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.730931 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-mv68r\"" Apr 20 20:07:56.731284 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.731263 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 20:07:56.731406 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.731295 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 20:07:56.731406 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.731311 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 20:07:56.734465 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.734443 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-4rnlw"] Apr 20 20:07:56.819521 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.819485 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl66m\" (UniqueName: \"kubernetes.io/projected/a81717d3-dbd2-40c7-8313-12171e9bd99b-kube-api-access-xl66m\") pod \"service-ca-865cb79987-4rnlw\" (UID: \"a81717d3-dbd2-40c7-8313-12171e9bd99b\") " pod="openshift-service-ca/service-ca-865cb79987-4rnlw" Apr 20 20:07:56.819672 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.819574 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a81717d3-dbd2-40c7-8313-12171e9bd99b-signing-cabundle\") pod \"service-ca-865cb79987-4rnlw\" (UID: \"a81717d3-dbd2-40c7-8313-12171e9bd99b\") " pod="openshift-service-ca/service-ca-865cb79987-4rnlw" Apr 20 20:07:56.819672 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.819617 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a81717d3-dbd2-40c7-8313-12171e9bd99b-signing-key\") pod \"service-ca-865cb79987-4rnlw\" (UID: \"a81717d3-dbd2-40c7-8313-12171e9bd99b\") " pod="openshift-service-ca/service-ca-865cb79987-4rnlw" Apr 20 20:07:56.919913 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.919891 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a81717d3-dbd2-40c7-8313-12171e9bd99b-signing-cabundle\") pod \"service-ca-865cb79987-4rnlw\" (UID: \"a81717d3-dbd2-40c7-8313-12171e9bd99b\") " pod="openshift-service-ca/service-ca-865cb79987-4rnlw" Apr 20 20:07:56.920002 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.919960 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a81717d3-dbd2-40c7-8313-12171e9bd99b-signing-key\") pod \"service-ca-865cb79987-4rnlw\" (UID: \"a81717d3-dbd2-40c7-8313-12171e9bd99b\") " pod="openshift-service-ca/service-ca-865cb79987-4rnlw" Apr 20 20:07:56.920002 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.919989 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xl66m\" (UniqueName: \"kubernetes.io/projected/a81717d3-dbd2-40c7-8313-12171e9bd99b-kube-api-access-xl66m\") pod \"service-ca-865cb79987-4rnlw\" (UID: \"a81717d3-dbd2-40c7-8313-12171e9bd99b\") " pod="openshift-service-ca/service-ca-865cb79987-4rnlw" Apr 20 20:07:56.920568 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.920543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a81717d3-dbd2-40c7-8313-12171e9bd99b-signing-cabundle\") pod \"service-ca-865cb79987-4rnlw\" (UID: \"a81717d3-dbd2-40c7-8313-12171e9bd99b\") " pod="openshift-service-ca/service-ca-865cb79987-4rnlw" Apr 20 20:07:56.922322 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.922301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a81717d3-dbd2-40c7-8313-12171e9bd99b-signing-key\") pod \"service-ca-865cb79987-4rnlw\" (UID: \"a81717d3-dbd2-40c7-8313-12171e9bd99b\") " pod="openshift-service-ca/service-ca-865cb79987-4rnlw" Apr 20 20:07:56.929440 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:56.929415 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl66m\" (UniqueName: \"kubernetes.io/projected/a81717d3-dbd2-40c7-8313-12171e9bd99b-kube-api-access-xl66m\") pod \"service-ca-865cb79987-4rnlw\" (UID: \"a81717d3-dbd2-40c7-8313-12171e9bd99b\") " pod="openshift-service-ca/service-ca-865cb79987-4rnlw" Apr 20 20:07:57.032796 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:57.032775 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-4rnlw" Apr 20 20:07:57.143007 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:57.142982 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-4rnlw"] Apr 20 20:07:57.145911 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:07:57.145884 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda81717d3_dbd2_40c7_8313_12171e9bd99b.slice/crio-936a31254a038009040369c23e8feb18f6de2629f4388a1c4d897f66e90c521a WatchSource:0}: Error finding container 936a31254a038009040369c23e8feb18f6de2629f4388a1c4d897f66e90c521a: Status 404 returned error can't find the container with id 936a31254a038009040369c23e8feb18f6de2629f4388a1c4d897f66e90c521a Apr 20 20:07:57.333954 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:57.333922 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zszld" event={"ID":"88f6fa69-2bdb-4e93-941b-4e14d4be2a2f","Type":"ContainerStarted","Data":"9e60ae016d29dc730f6857f6cca4b224661043d9d18497c82fa6a4b065fc43f5"} Apr 20 20:07:57.334085 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:57.333963 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zszld" event={"ID":"88f6fa69-2bdb-4e93-941b-4e14d4be2a2f","Type":"ContainerStarted","Data":"0e29fa38805559db6667e648822dd2b083e7b26b5fb5463dc6a6bdc11e68f93b"} Apr 20 20:07:57.334881 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:57.334860 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-4rnlw" 
event={"ID":"a81717d3-dbd2-40c7-8313-12171e9bd99b","Type":"ContainerStarted","Data":"936a31254a038009040369c23e8feb18f6de2629f4388a1c4d897f66e90c521a"} Apr 20 20:07:57.354498 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:57.354451 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zszld" podStartSLOduration=1.2879004840000001 podStartE2EDuration="2.354439137s" podCreationTimestamp="2026-04-20 20:07:55 +0000 UTC" firstStartedPulling="2026-04-20 20:07:55.79508724 +0000 UTC m=+150.507769218" lastFinishedPulling="2026-04-20 20:07:56.861625884 +0000 UTC m=+151.574307871" observedRunningTime="2026-04-20 20:07:57.353688439 +0000 UTC m=+152.066370435" watchObservedRunningTime="2026-04-20 20:07:57.354439137 +0000 UTC m=+152.067121145" Apr 20 20:07:57.626080 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:57.625986 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-85nmr\" (UID: \"10ad3736-4c80-4544-ad33-36bdd65c0779\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr" Apr 20 20:07:57.626233 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:57.626169 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 20:07:57.626294 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:07:57.626249 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert podName:10ad3736-4c80-4544-ad33-36bdd65c0779 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:01.626228312 +0000 UTC m=+156.338910297 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-85nmr" (UID: "10ad3736-4c80-4544-ad33-36bdd65c0779") : secret "networking-console-plugin-cert" not found Apr 20 20:07:59.341495 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:59.341460 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-4rnlw" event={"ID":"a81717d3-dbd2-40c7-8313-12171e9bd99b","Type":"ContainerStarted","Data":"b3bac44dae318e59a9380fff21b830e2a28477bf187d05b4c89458c93d110ca4"} Apr 20 20:07:59.368920 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:07:59.368870 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-4rnlw" podStartSLOduration=1.761164561 podStartE2EDuration="3.368854868s" podCreationTimestamp="2026-04-20 20:07:56 +0000 UTC" firstStartedPulling="2026-04-20 20:07:57.147674269 +0000 UTC m=+151.860356245" lastFinishedPulling="2026-04-20 20:07:58.755364573 +0000 UTC m=+153.468046552" observedRunningTime="2026-04-20 20:07:59.367413634 +0000 UTC m=+154.080095622" watchObservedRunningTime="2026-04-20 20:07:59.368854868 +0000 UTC m=+154.081536867" Apr 20 20:08:01.154301 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:08:01.154256 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" podUID="a78ff420-4c0e-42f9-a564-7d110a40f9d0" Apr 20 20:08:01.180502 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:08:01.180481 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-s4dgk" podUID="3dfe201c-4c10-4a3b-98f1-39ca2bed620f" Apr 20 20:08:01.247578 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:08:01.247547 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-x2zsb" podUID="88bdb2a3-8afb-427f-bc63-e22166098be9" Apr 20 20:08:01.661124 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:01.661076 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-85nmr\" (UID: \"10ad3736-4c80-4544-ad33-36bdd65c0779\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr" Apr 20 20:08:01.661272 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:08:01.661224 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 20:08:01.661308 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:08:01.661290 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert podName:10ad3736-4c80-4544-ad33-36bdd65c0779 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:09.661274408 +0000 UTC m=+164.373956387 (durationBeforeRetry 8s). 
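The three pod_workers entries above ("Error syncing pod, skipping ... context deadline exceeded") mark the point where a sync attempt gives up waiting for volumes: the pod worker waits on the volume manager under a bounded context, and when the deadline passes with volumes still unmounted it abandons that attempt and lets a later sync retry. The sketch below shows the pattern; the 2m3s window mirrors the kubelet's pod attach/mount timeout but should be treated as an assumption rather than a quoted constant.

```go
// Sketch of the bounded wait behind "context deadline exceeded". The 2m3s
// value is an assumed stand-in for the kubelet's attach/mount wait window;
// running this blocks for the full window because the channel is never
// closed, just as the secret never appeared in time above.
package main

import (
	"context"
	"fmt"
	"time"
)

func waitForMounts(ctx context.Context, mounted <-chan struct{}) error {
	select {
	case <-mounted:
		return nil // all volumes mounted in time
	case <-ctx.Done():
		return fmt.Errorf("unmounted volumes=[cert]: %w", ctx.Err())
	}
}

func main() {
	ctx, cancel := context.WithTimeout(context.Background(), 2*time.Minute+3*time.Second)
	defer cancel()

	mounted := make(chan struct{}) // would be closed once every volume mounts
	if err := waitForMounts(ctx, mounted); err != nil {
		fmt.Println("Error syncing pod, skipping:", err)
	}
}
```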
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-85nmr" (UID: "10ad3736-4c80-4544-ad33-36bdd65c0779") : secret "networking-console-plugin-cert" not found Apr 20 20:08:02.951412 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:08:02.951373 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-2n29c" podUID="4c96dea8-a54f-4ca2-a3fb-757208554fe3" Apr 20 20:08:05.996615 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:05.996584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:08:05.998896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:05.998877 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls\") pod \"image-registry-6f7749bc57-7ktsn\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:08:06.097642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:06.097612 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:08:06.097774 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:06.097670 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert\") pod \"ingress-canary-s4dgk\" (UID: \"3dfe201c-4c10-4a3b-98f1-39ca2bed620f\") " pod="openshift-ingress-canary/ingress-canary-s4dgk" Apr 20 20:08:06.099825 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:06.099799 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/88bdb2a3-8afb-427f-bc63-e22166098be9-metrics-tls\") pod \"dns-default-x2zsb\" (UID: \"88bdb2a3-8afb-427f-bc63-e22166098be9\") " pod="openshift-dns/dns-default-x2zsb" Apr 20 20:08:06.099950 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:06.099932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3dfe201c-4c10-4a3b-98f1-39ca2bed620f-cert\") pod \"ingress-canary-s4dgk\" (UID: \"3dfe201c-4c10-4a3b-98f1-39ca2bed620f\") " pod="openshift-ingress-canary/ingress-canary-s4dgk" Apr 20 20:08:09.723326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:09.723292 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-85nmr\" (UID: \"10ad3736-4c80-4544-ad33-36bdd65c0779\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr" Apr 20 20:08:09.725726 ip-10-0-143-23 
kubenswrapper[2576]: I0420 20:08:09.725699 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/10ad3736-4c80-4544-ad33-36bdd65c0779-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-85nmr\" (UID: \"10ad3736-4c80-4544-ad33-36bdd65c0779\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr" Apr 20 20:08:09.845846 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:09.845818 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr" Apr 20 20:08:09.965292 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:09.965263 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-85nmr"] Apr 20 20:08:09.968412 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:08:09.968382 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10ad3736_4c80_4544_ad33_36bdd65c0779.slice/crio-2306d10ba43bbd32a5f98d070f5579c01ca78ce07d7b78c97673f5c38ce2a351 WatchSource:0}: Error finding container 2306d10ba43bbd32a5f98d070f5579c01ca78ce07d7b78c97673f5c38ce2a351: Status 404 returned error can't find the container with id 2306d10ba43bbd32a5f98d070f5579c01ca78ce07d7b78c97673f5c38ce2a351 Apr 20 20:08:10.368092 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:10.368060 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr" event={"ID":"10ad3736-4c80-4544-ad33-36bdd65c0779","Type":"ContainerStarted","Data":"2306d10ba43bbd32a5f98d070f5579c01ca78ce07d7b78c97673f5c38ce2a351"} Apr 20 20:08:11.371607 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:11.371571 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr" event={"ID":"10ad3736-4c80-4544-ad33-36bdd65c0779","Type":"ContainerStarted","Data":"dba99e75cc50fecb09e41227f2f4bc14f8e552711fe80da1abf0f336a656b057"} Apr 20 20:08:11.388336 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:11.388290 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-85nmr" podStartSLOduration=17.379647388 podStartE2EDuration="18.388276033s" podCreationTimestamp="2026-04-20 20:07:53 +0000 UTC" firstStartedPulling="2026-04-20 20:08:09.97072267 +0000 UTC m=+164.683404644" lastFinishedPulling="2026-04-20 20:08:10.979351315 +0000 UTC m=+165.692033289" observedRunningTime="2026-04-20 20:08:11.38790756 +0000 UTC m=+166.100589709" watchObservedRunningTime="2026-04-20 20:08:11.388276033 +0000 UTC m=+166.100958029" Apr 20 20:08:12.123497 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:12.123443 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" podUID="ee6a48cf-727f-4b03-a512-585f80242787" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.9:8000/readyz\": dial tcp 10.132.0.9:8000: connect: connection refused" Apr 20 20:08:12.374808 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:12.374731 2576 generic.go:358] "Generic (PLEG): container finished" podID="ee6a48cf-727f-4b03-a512-585f80242787" containerID="77feb5849cdad77328b0aa4af693dff947b83fc023c4d8c723823dda81c8a447" exitCode=1 Apr 20 20:08:12.375155 ip-10-0-143-23 
Apr 20 20:08:12.375155 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:12.374807 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" event={"ID":"ee6a48cf-727f-4b03-a512-585f80242787","Type":"ContainerDied","Data":"77feb5849cdad77328b0aa4af693dff947b83fc023c4d8c723823dda81c8a447"}
Apr 20 20:08:12.375238 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:12.375222 2576 scope.go:117] "RemoveContainer" containerID="77feb5849cdad77328b0aa4af693dff947b83fc023c4d8c723823dda81c8a447"
Apr 20 20:08:13.378497 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:13.378464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs" event={"ID":"ee6a48cf-727f-4b03-a512-585f80242787","Type":"ContainerStarted","Data":"5c27c5bbb93cf20e208990f19941ea7738863f6b904bd066daf56950c4b187d7"}
Apr 20 20:08:13.378834 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:13.378729 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs"
Apr 20 20:08:13.379311 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:13.379290 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f4898cd7d-ljxhs"
Apr 20 20:08:13.939977 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:13.939951 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn"
Apr 20 20:08:13.940171 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:13.940151 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x2zsb"
Apr 20 20:08:13.944655 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:13.944634 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m22kk\""
Apr 20 20:08:13.944732 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:13.944634 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dhv99\""
Apr 20 20:08:13.951282 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:13.951268 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x2zsb"
Apr 20 20:08:13.951376 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:13.951361 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn"
Apr 20 20:08:14.077206 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:14.077178 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x2zsb"]
Apr 20 20:08:14.080177 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:08:14.080148 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88bdb2a3_8afb_427f_bc63_e22166098be9.slice/crio-fcd189074be3b7e2181a3b140bbb9ee099247f8f31a4ee8f65b412765b477993 WatchSource:0}: Error finding container fcd189074be3b7e2181a3b140bbb9ee099247f8f31a4ee8f65b412765b477993: Status 404 returned error can't find the container with id fcd189074be3b7e2181a3b140bbb9ee099247f8f31a4ee8f65b412765b477993
Apr 20 20:08:14.094302 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:14.094280 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6f7749bc57-7ktsn"]
Apr 20 20:08:14.096905 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:08:14.096880 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda78ff420_4c0e_42f9_a564_7d110a40f9d0.slice/crio-4b5647dd18ff0863bcaad60cd6f7f56911504ea12a16f007acb1013776019058 WatchSource:0}: Error finding container 4b5647dd18ff0863bcaad60cd6f7f56911504ea12a16f007acb1013776019058: Status 404 returned error can't find the container with id 4b5647dd18ff0863bcaad60cd6f7f56911504ea12a16f007acb1013776019058
Apr 20 20:08:14.384132 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:14.384085 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" event={"ID":"a78ff420-4c0e-42f9-a564-7d110a40f9d0","Type":"ContainerStarted","Data":"38c993cad43d63daad74446f4e37a41ef261180edc53d9d63288ce3ca6f2d2ec"}
Apr 20 20:08:14.384530 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:14.384139 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" event={"ID":"a78ff420-4c0e-42f9-a564-7d110a40f9d0","Type":"ContainerStarted","Data":"4b5647dd18ff0863bcaad60cd6f7f56911504ea12a16f007acb1013776019058"}
Apr 20 20:08:14.384530 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:14.384160 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn"
Apr 20 20:08:14.385156 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:14.385132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x2zsb" event={"ID":"88bdb2a3-8afb-427f-bc63-e22166098be9","Type":"ContainerStarted","Data":"fcd189074be3b7e2181a3b140bbb9ee099247f8f31a4ee8f65b412765b477993"}
Apr 20 20:08:14.408455 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:14.408413 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" podStartSLOduration=168.408399921 podStartE2EDuration="2m48.408399921s" podCreationTimestamp="2026-04-20 20:05:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:08:14.406906339 +0000 UTC m=+169.119588335" watchObservedRunningTime="2026-04-20 20:08:14.408399921 +0000 UTC m=+169.121081908"
Apr 20 20:08:14.939410 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:14.939374 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s4dgk"
Apr 20 20:08:14.942469 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:14.942447 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-wkgqm\""
Apr 20 20:08:14.950640 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:14.950620 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s4dgk"
Apr 20 20:08:15.372000 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:15.371971 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-s4dgk"]
Apr 20 20:08:15.375470 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:08:15.375435 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dfe201c_4c10_4a3b_98f1_39ca2bed620f.slice/crio-88edf039c888d1a0c696572b0985b4b425d8af3756da8493be915388cc54c89f WatchSource:0}: Error finding container 88edf039c888d1a0c696572b0985b4b425d8af3756da8493be915388cc54c89f: Status 404 returned error can't find the container with id 88edf039c888d1a0c696572b0985b4b425d8af3756da8493be915388cc54c89f
Apr 20 20:08:15.389341 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:15.389312 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s4dgk" event={"ID":"3dfe201c-4c10-4a3b-98f1-39ca2bed620f","Type":"ContainerStarted","Data":"88edf039c888d1a0c696572b0985b4b425d8af3756da8493be915388cc54c89f"}
Apr 20 20:08:15.943646 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:15.943615 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c"
Apr 20 20:08:16.394031 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:16.393993 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x2zsb" event={"ID":"88bdb2a3-8afb-427f-bc63-e22166098be9","Type":"ContainerStarted","Data":"f8b97a258e245d47cc491b6e17e1c10191c9b78103b8333cdf175b6586acd013"}
Apr 20 20:08:16.394031 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:16.394032 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x2zsb" event={"ID":"88bdb2a3-8afb-427f-bc63-e22166098be9","Type":"ContainerStarted","Data":"e7538b93174acdbaa411b9f6f845cc4ebc783248a8e1ca4430255981bf448aba"}
Apr 20 20:08:16.394520 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:16.394213 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-x2zsb"
Apr 20 20:08:16.416169 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:16.416095 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-x2zsb" podStartSLOduration=137.203505553 podStartE2EDuration="2m18.416080775s" podCreationTimestamp="2026-04-20 20:05:58 +0000 UTC" firstStartedPulling="2026-04-20 20:08:14.082018871 +0000 UTC m=+168.794700845" lastFinishedPulling="2026-04-20 20:08:15.294594074 +0000 UTC m=+170.007276067" observedRunningTime="2026-04-20 20:08:16.414458664 +0000 UTC m=+171.127140661" watchObservedRunningTime="2026-04-20 20:08:16.416080775 +0000 UTC m=+171.128762772"
Apr 20 20:08:17.398292 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:17.398254 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s4dgk" event={"ID":"3dfe201c-4c10-4a3b-98f1-39ca2bed620f","Type":"ContainerStarted","Data":"fefd92e0ea231ca9c2ffd6d5576d62c10bff90e0ccbd79b8a9f8b5a3dd489e23"}
Apr 20 20:08:17.419617 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:17.419574 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-s4dgk" podStartSLOduration=137.859775711 podStartE2EDuration="2m19.419560835s" podCreationTimestamp="2026-04-20 20:05:58 +0000 UTC" firstStartedPulling="2026-04-20 20:08:15.378463282 +0000 UTC m=+170.091145258" lastFinishedPulling="2026-04-20 20:08:16.938248403 +0000 UTC m=+171.650930382" observedRunningTime="2026-04-20 20:08:17.418483133 +0000 UTC m=+172.131165154" watchObservedRunningTime="2026-04-20 20:08:17.419560835 +0000 UTC m=+172.132242831"
Apr 20 20:08:20.948087 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:20.948050 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59c5444cfc-k8zdh"]
Apr 20 20:08:20.952750 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:20.952727 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59c5444cfc-k8zdh"
Apr 20 20:08:20.955479 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:20.955457 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 20 20:08:20.956882 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:20.956856 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 20 20:08:20.956978 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:20.956901 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 20 20:08:20.956978 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:20.956920 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 20 20:08:20.956978 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:20.956947 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 20 20:08:20.957168 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:20.956992 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-9pl5s\""
Apr 20 20:08:20.957168 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:20.956947 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 20 20:08:20.957168 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:20.957150 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 20 20:08:20.969005 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:20.968987 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59c5444cfc-k8zdh"]
Apr 20 20:08:21.052990 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.052968 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8fmnj"]
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8fmnj" Apr 20 20:08:21.059216 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.059201 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-cfjtr\"" Apr 20 20:08:21.059446 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.059420 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 20 20:08:21.070876 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.070855 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8fmnj"] Apr 20 20:08:21.078610 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.078590 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hmlx5"] Apr 20 20:08:21.081860 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.081840 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.086070 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.086041 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 20:08:21.086211 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.086093 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 20:08:21.086211 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.086132 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 20:08:21.086211 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.086144 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-lxzp7\"" Apr 20 20:08:21.086514 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.086494 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 20:08:21.100703 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.100684 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hmlx5"] Apr 20 20:08:21.110086 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.110066 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-serving-cert\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.110196 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.110101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-oauth-config\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.110196 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.110181 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-service-ca\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.110308 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.110214 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-oauth-serving-cert\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.110308 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.110242 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-config\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.110308 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.110293 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24m45\" (UniqueName: \"kubernetes.io/projected/bf5957de-812d-47f7-b353-00ec68e9ecf4-kube-api-access-24m45\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.210620 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.210561 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24m45\" (UniqueName: \"kubernetes.io/projected/bf5957de-812d-47f7-b353-00ec68e9ecf4-kube-api-access-24m45\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.210620 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.210606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-serving-cert\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.210790 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.210637 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/25f0a3e8-817e-4737-ac33-9ebc57ac1001-crio-socket\") pod \"insights-runtime-extractor-hmlx5\" (UID: \"25f0a3e8-817e-4737-ac33-9ebc57ac1001\") " pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.210790 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.210661 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6t6z\" (UniqueName: \"kubernetes.io/projected/25f0a3e8-817e-4737-ac33-9ebc57ac1001-kube-api-access-q6t6z\") pod \"insights-runtime-extractor-hmlx5\" (UID: \"25f0a3e8-817e-4737-ac33-9ebc57ac1001\") " pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.210790 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.210691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-oauth-config\") pod 
\"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.210790 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.210763 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/25f0a3e8-817e-4737-ac33-9ebc57ac1001-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hmlx5\" (UID: \"25f0a3e8-817e-4737-ac33-9ebc57ac1001\") " pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.210974 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.210806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-service-ca\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.210974 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.210852 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-oauth-serving-cert\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.210974 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.210893 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-config\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.210974 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.210920 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/25f0a3e8-817e-4737-ac33-9ebc57ac1001-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hmlx5\" (UID: \"25f0a3e8-817e-4737-ac33-9ebc57ac1001\") " pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.210974 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.210940 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/695e2e8f-4584-4016-a017-c63df5f16230-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8fmnj\" (UID: \"695e2e8f-4584-4016-a017-c63df5f16230\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8fmnj" Apr 20 20:08:21.210974 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.210970 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/25f0a3e8-817e-4737-ac33-9ebc57ac1001-data-volume\") pod \"insights-runtime-extractor-hmlx5\" (UID: \"25f0a3e8-817e-4737-ac33-9ebc57ac1001\") " pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.211483 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.211463 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-service-ca\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " 
pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.211522 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.211491 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-config\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.211554 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.211531 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-oauth-serving-cert\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.212967 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.212946 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-serving-cert\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.213481 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.213465 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-oauth-config\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.226661 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.226640 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24m45\" (UniqueName: \"kubernetes.io/projected/bf5957de-812d-47f7-b353-00ec68e9ecf4-kube-api-access-24m45\") pod \"console-59c5444cfc-k8zdh\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.261748 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.261727 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:08:21.311585 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.311556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/25f0a3e8-817e-4737-ac33-9ebc57ac1001-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hmlx5\" (UID: \"25f0a3e8-817e-4737-ac33-9ebc57ac1001\") " pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.311714 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.311632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/25f0a3e8-817e-4737-ac33-9ebc57ac1001-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hmlx5\" (UID: \"25f0a3e8-817e-4737-ac33-9ebc57ac1001\") " pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.311714 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.311662 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/695e2e8f-4584-4016-a017-c63df5f16230-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8fmnj\" (UID: \"695e2e8f-4584-4016-a017-c63df5f16230\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8fmnj" Apr 20 20:08:21.311714 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.311693 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/25f0a3e8-817e-4737-ac33-9ebc57ac1001-data-volume\") pod \"insights-runtime-extractor-hmlx5\" (UID: \"25f0a3e8-817e-4737-ac33-9ebc57ac1001\") " pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.311862 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.311754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/25f0a3e8-817e-4737-ac33-9ebc57ac1001-crio-socket\") pod \"insights-runtime-extractor-hmlx5\" (UID: \"25f0a3e8-817e-4737-ac33-9ebc57ac1001\") " pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.311862 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.311778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q6t6z\" (UniqueName: \"kubernetes.io/projected/25f0a3e8-817e-4737-ac33-9ebc57ac1001-kube-api-access-q6t6z\") pod \"insights-runtime-extractor-hmlx5\" (UID: \"25f0a3e8-817e-4737-ac33-9ebc57ac1001\") " pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.312483 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.312348 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/25f0a3e8-817e-4737-ac33-9ebc57ac1001-crio-socket\") pod \"insights-runtime-extractor-hmlx5\" (UID: \"25f0a3e8-817e-4737-ac33-9ebc57ac1001\") " pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.312483 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.312413 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/25f0a3e8-817e-4737-ac33-9ebc57ac1001-data-volume\") pod \"insights-runtime-extractor-hmlx5\" (UID: \"25f0a3e8-817e-4737-ac33-9ebc57ac1001\") " pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.312931 
ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.312816 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/25f0a3e8-817e-4737-ac33-9ebc57ac1001-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hmlx5\" (UID: \"25f0a3e8-817e-4737-ac33-9ebc57ac1001\") " pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.314096 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.314074 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/695e2e8f-4584-4016-a017-c63df5f16230-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-8fmnj\" (UID: \"695e2e8f-4584-4016-a017-c63df5f16230\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8fmnj" Apr 20 20:08:21.314484 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.314467 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/25f0a3e8-817e-4737-ac33-9ebc57ac1001-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hmlx5\" (UID: \"25f0a3e8-817e-4737-ac33-9ebc57ac1001\") " pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.325354 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.325334 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6t6z\" (UniqueName: \"kubernetes.io/projected/25f0a3e8-817e-4737-ac33-9ebc57ac1001-kube-api-access-q6t6z\") pod \"insights-runtime-extractor-hmlx5\" (UID: \"25f0a3e8-817e-4737-ac33-9ebc57ac1001\") " pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.363810 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.363775 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8fmnj" Apr 20 20:08:21.390552 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.390523 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hmlx5" Apr 20 20:08:21.414672 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.414646 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59c5444cfc-k8zdh"] Apr 20 20:08:21.418223 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:08:21.418147 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf5957de_812d_47f7_b353_00ec68e9ecf4.slice/crio-248d5b95f6497723e9d31198ae02297ad7bc6eb61b6c3f5c12becaa7b759854f WatchSource:0}: Error finding container 248d5b95f6497723e9d31198ae02297ad7bc6eb61b6c3f5c12becaa7b759854f: Status 404 returned error can't find the container with id 248d5b95f6497723e9d31198ae02297ad7bc6eb61b6c3f5c12becaa7b759854f Apr 20 20:08:21.490816 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.490792 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8fmnj"] Apr 20 20:08:21.492399 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:08:21.492369 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod695e2e8f_4584_4016_a017_c63df5f16230.slice/crio-ddadd2dff895cd05386740f2f64ba69409a7733fe42ffefded50724c3024f598 WatchSource:0}: Error finding container ddadd2dff895cd05386740f2f64ba69409a7733fe42ffefded50724c3024f598: Status 404 returned error can't find the container with id ddadd2dff895cd05386740f2f64ba69409a7733fe42ffefded50724c3024f598 Apr 20 20:08:21.516997 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:21.516972 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hmlx5"] Apr 20 20:08:21.519736 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:08:21.519706 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25f0a3e8_817e_4737_ac33_9ebc57ac1001.slice/crio-ac68f508125903055c77a307a3943fffe0df632e53bce7aee743dc0250fe7083 WatchSource:0}: Error finding container ac68f508125903055c77a307a3943fffe0df632e53bce7aee743dc0250fe7083: Status 404 returned error can't find the container with id ac68f508125903055c77a307a3943fffe0df632e53bce7aee743dc0250fe7083 Apr 20 20:08:22.416272 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:22.416189 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hmlx5" event={"ID":"25f0a3e8-817e-4737-ac33-9ebc57ac1001","Type":"ContainerStarted","Data":"f860a46a2a67ddb4a93d54ae78bd7706543877f110dbf5c0ec21a27312d6adaa"} Apr 20 20:08:22.416272 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:22.416235 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hmlx5" event={"ID":"25f0a3e8-817e-4737-ac33-9ebc57ac1001","Type":"ContainerStarted","Data":"7c593a96b1baf845e1c1162234a968c57d673197935435ad9323d6bf85f71a0f"} Apr 20 20:08:22.416272 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:22.416248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hmlx5" event={"ID":"25f0a3e8-817e-4737-ac33-9ebc57ac1001","Type":"ContainerStarted","Data":"ac68f508125903055c77a307a3943fffe0df632e53bce7aee743dc0250fe7083"} Apr 20 20:08:22.417589 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:22.417558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8fmnj" event={"ID":"695e2e8f-4584-4016-a017-c63df5f16230","Type":"ContainerStarted","Data":"ddadd2dff895cd05386740f2f64ba69409a7733fe42ffefded50724c3024f598"} Apr 20 20:08:22.418987 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:22.418961 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c5444cfc-k8zdh" event={"ID":"bf5957de-812d-47f7-b353-00ec68e9ecf4","Type":"ContainerStarted","Data":"248d5b95f6497723e9d31198ae02297ad7bc6eb61b6c3f5c12becaa7b759854f"} Apr 20 20:08:23.423491 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:23.423453 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8fmnj" event={"ID":"695e2e8f-4584-4016-a017-c63df5f16230","Type":"ContainerStarted","Data":"79e31e269a5971873153531a0ff667364f716802ae05cca5f8eb1bb35de5f169"} Apr 20 20:08:23.423921 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:23.423652 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8fmnj" Apr 20 20:08:23.429497 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:23.429370 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8fmnj" Apr 20 20:08:23.440348 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:23.440301 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-8fmnj" podStartSLOduration=1.353557383 podStartE2EDuration="2.44028384s" podCreationTimestamp="2026-04-20 20:08:21 +0000 UTC" firstStartedPulling="2026-04-20 20:08:21.494333095 +0000 UTC m=+176.207015069" lastFinishedPulling="2026-04-20 20:08:22.581059544 +0000 UTC m=+177.293741526" observedRunningTime="2026-04-20 20:08:23.439222008 +0000 UTC m=+178.151904030" watchObservedRunningTime="2026-04-20 20:08:23.44028384 +0000 UTC m=+178.152965839" Apr 20 20:08:25.430326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:25.430286 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hmlx5" event={"ID":"25f0a3e8-817e-4737-ac33-9ebc57ac1001","Type":"ContainerStarted","Data":"8fc0d0bea4c187560dbad2454c47844a24cfb8fa0edcd3ff232c83337a19ec30"} Apr 20 20:08:25.431581 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:25.431561 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c5444cfc-k8zdh" event={"ID":"bf5957de-812d-47f7-b353-00ec68e9ecf4","Type":"ContainerStarted","Data":"36963ee6200dc59d039abeb3f4d793ab77de6980063581593ca2d8c839ca1624"} Apr 20 20:08:25.455156 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:25.455091 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hmlx5" podStartSLOduration=1.5599948719999999 podStartE2EDuration="4.455080176s" podCreationTimestamp="2026-04-20 20:08:21 +0000 UTC" firstStartedPulling="2026-04-20 20:08:21.578075462 +0000 UTC m=+176.290757436" lastFinishedPulling="2026-04-20 20:08:24.47316076 +0000 UTC m=+179.185842740" observedRunningTime="2026-04-20 20:08:25.454324672 +0000 UTC m=+180.167006669" watchObservedRunningTime="2026-04-20 20:08:25.455080176 +0000 UTC m=+180.167762172" Apr 20 20:08:25.480203 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:25.480167 2576 
Apr 20 20:08:25.480203 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:25.480167 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59c5444cfc-k8zdh" podStartSLOduration=2.427896013 podStartE2EDuration="5.480157181s" podCreationTimestamp="2026-04-20 20:08:20 +0000 UTC" firstStartedPulling="2026-04-20 20:08:21.420060584 +0000 UTC m=+176.132742559" lastFinishedPulling="2026-04-20 20:08:24.472321746 +0000 UTC m=+179.185003727" observedRunningTime="2026-04-20 20:08:25.479419037 +0000 UTC m=+180.192101032" watchObservedRunningTime="2026-04-20 20:08:25.480157181 +0000 UTC m=+180.192839177"
Apr 20 20:08:26.400176 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:26.400145 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-x2zsb"
Apr 20 20:08:27.135263 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.135231 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-78556b798c-d8n9n"]
Apr 20 20:08:27.141660 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.141640 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.151221 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.151201 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 20 20:08:27.162508 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.162484 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78556b798c-d8n9n"]
Apr 20 20:08:27.255609 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.255580 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-oauth-serving-cert\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.255737 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.255621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-service-ca\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.255737 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.255648 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-serving-cert\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.255737 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.255681 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjgwj\" (UniqueName: \"kubernetes.io/projected/6832f12c-cb68-4e4c-9dde-084003a3d82d-kube-api-access-hjgwj\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.255737 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.255702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-oauth-config\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.255737 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.255721 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-trusted-ca-bundle\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.255903 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.255744 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-config\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.356676 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.356648 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-config\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.356789 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.356707 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-oauth-serving-cert\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.356789 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.356737 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-service-ca\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.356789 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.356756 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-serving-cert\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.356789 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.356773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjgwj\" (UniqueName: \"kubernetes.io/projected/6832f12c-cb68-4e4c-9dde-084003a3d82d-kube-api-access-hjgwj\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.356966 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.356798 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-oauth-config\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.356966 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.356856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-trusted-ca-bundle\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.357471 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.357448 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-oauth-serving-cert\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.357572 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.357495 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-config\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.357572 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.357476 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-service-ca\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.357705 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.357681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-trusted-ca-bundle\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.359122 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.359079 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-oauth-config\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.359318 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.359296 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-serving-cert\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.366749 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.366726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjgwj\" (UniqueName: \"kubernetes.io/projected/6832f12c-cb68-4e4c-9dde-084003a3d82d-kube-api-access-hjgwj\") pod \"console-78556b798c-d8n9n\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.450763 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.450697 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78556b798c-d8n9n"
Apr 20 20:08:27.566689 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:27.566666 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78556b798c-d8n9n"]
Apr 20 20:08:27.569007 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:08:27.568976 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6832f12c_cb68_4e4c_9dde_084003a3d82d.slice/crio-3e1287a4a74d8762f0c9801eb3cb28382ef2705aa464e9d1cb1b5aefff6c9e23 WatchSource:0}: Error finding container 3e1287a4a74d8762f0c9801eb3cb28382ef2705aa464e9d1cb1b5aefff6c9e23: Status 404 returned error can't find the container with id 3e1287a4a74d8762f0c9801eb3cb28382ef2705aa464e9d1cb1b5aefff6c9e23
Apr 20 20:08:28.441574 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.441539 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78556b798c-d8n9n" event={"ID":"6832f12c-cb68-4e4c-9dde-084003a3d82d","Type":"ContainerStarted","Data":"ea099e5dbeffe2ff442c0ed6edaa7b0d8c887b29a26747f79ff9a74db3b565b4"}
Apr 20 20:08:28.441574 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.441579 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78556b798c-d8n9n" event={"ID":"6832f12c-cb68-4e4c-9dde-084003a3d82d","Type":"ContainerStarted","Data":"3e1287a4a74d8762f0c9801eb3cb28382ef2705aa464e9d1cb1b5aefff6c9e23"}
Apr 20 20:08:28.463061 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.463008 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78556b798c-d8n9n" podStartSLOduration=1.462995327 podStartE2EDuration="1.462995327s" podCreationTimestamp="2026-04-20 20:08:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:08:28.461091637 +0000 UTC m=+183.173773633" watchObservedRunningTime="2026-04-20 20:08:28.462995327 +0000 UTC m=+183.175677322"
Apr 20 20:08:28.753143 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.753043 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-srrtl"]
Need to start a new one" pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.782431 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.782218 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 20:08:28.782431 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.782226 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 20:08:28.782431 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.782261 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7dbd8\"" Apr 20 20:08:28.782431 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.782282 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 20:08:28.782431 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.782292 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 20:08:28.782431 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.782383 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 20:08:28.782790 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.782588 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 20:08:28.868009 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.867982 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-textfile\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.868156 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.868037 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76f7379a-b448-41ee-851d-a712d528cbda-metrics-client-ca\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.868156 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.868078 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbwfz\" (UniqueName: \"kubernetes.io/projected/76f7379a-b448-41ee-851d-a712d528cbda-kube-api-access-jbwfz\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.868266 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.868165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/76f7379a-b448-41ee-851d-a712d528cbda-root\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.868266 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.868203 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-wtmp\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.868266 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.868241 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-accelerators-collector-config\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.868401 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.868306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-tls\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.868401 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.868336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.868401 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.868383 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/76f7379a-b448-41ee-851d-a712d528cbda-sys\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.968912 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.968884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-wtmp\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.969013 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.968927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-accelerators-collector-config\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.969013 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.968976 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-tls\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.969013 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.968999 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.969180 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.969042 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-wtmp\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.969180 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.969104 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/76f7379a-b448-41ee-851d-a712d528cbda-sys\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.969180 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:08:28.969142 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 20:08:28.969180 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.969161 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/76f7379a-b448-41ee-851d-a712d528cbda-sys\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.969180 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.969165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-textfile\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.969428 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:08:28.969210 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-tls podName:76f7379a-b448-41ee-851d-a712d528cbda nodeName:}" failed. No retries permitted until 2026-04-20 20:08:29.469186123 +0000 UTC m=+184.181868097 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-tls") pod "node-exporter-srrtl" (UID: "76f7379a-b448-41ee-851d-a712d528cbda") : secret "node-exporter-tls" not found Apr 20 20:08:28.969428 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.969257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76f7379a-b448-41ee-851d-a712d528cbda-metrics-client-ca\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.969428 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.969294 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbwfz\" (UniqueName: \"kubernetes.io/projected/76f7379a-b448-41ee-851d-a712d528cbda-kube-api-access-jbwfz\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.969428 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.969338 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/76f7379a-b448-41ee-851d-a712d528cbda-root\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.969428 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.969414 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/76f7379a-b448-41ee-851d-a712d528cbda-root\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.969660 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.969566 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-textfile\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.969707 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.969678 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-accelerators-collector-config\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.969823 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.969807 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/76f7379a-b448-41ee-851d-a712d528cbda-metrics-client-ca\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.971438 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.971412 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:28.983149 
ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:28.983128 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbwfz\" (UniqueName: \"kubernetes.io/projected/76f7379a-b448-41ee-851d-a712d528cbda-kube-api-access-jbwfz\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:29.475085 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:29.475054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-tls\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:29.475445 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:08:29.475168 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 20:08:29.475445 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:08:29.475215 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-tls podName:76f7379a-b448-41ee-851d-a712d528cbda nodeName:}" failed. No retries permitted until 2026-04-20 20:08:30.475202102 +0000 UTC m=+185.187884076 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-tls") pod "node-exporter-srrtl" (UID: "76f7379a-b448-41ee-851d-a712d528cbda") : secret "node-exporter-tls" not found Apr 20 20:08:30.481559 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:30.481523 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-tls\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:30.483683 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:30.483661 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/76f7379a-b448-41ee-851d-a712d528cbda-node-exporter-tls\") pod \"node-exporter-srrtl\" (UID: \"76f7379a-b448-41ee-851d-a712d528cbda\") " pod="openshift-monitoring/node-exporter-srrtl" Apr 20 20:08:30.588005 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:30.587976 2576 util.go:30] "No sandbox for pod can be found. 
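
The node-exporter-tls failures above illustrate the kubelet's per-operation retry backoff: the first failed SetUp is retried after 500ms ("durationBeforeRetry 500ms"), the next after 1s, and the 20:08:30 retry succeeds once the secret exists. A minimal sketch of that doubling delay, assuming only the 500ms start and the factor of two visible in this log (the real scheduling lives in kubelet's nestedpendingoperations.go; maxDelay below is an illustrative assumption, not a value taken from this log):

    package main

    import (
        "fmt"
        "time"
    )

    // nextBackoff returns the delay before the next retry of a failed
    // volume operation: 500ms first, then doubling up to an assumed cap.
    func nextBackoff(current, maxDelay time.Duration) time.Duration {
        if current == 0 {
            return 500 * time.Millisecond // "durationBeforeRetry 500ms"
        }
        if next := 2 * current; next < maxDelay {
            return next
        }
        return maxDelay
    }

    func main() {
        var d time.Duration
        for i := 0; i < 4; i++ {
            d = nextBackoff(d, 2*time.Minute) // cap is illustrative
            fmt.Println(d)                    // 500ms, 1s, 2s, 4s; the log shows the first two steps
        }
    }
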
Apr 20 20:08:30.588005 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:30.587976 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-srrtl"
Apr 20 20:08:30.595950 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:08:30.595918 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76f7379a_b448_41ee_851d_a712d528cbda.slice/crio-ceefa7628f71a54c131c13322008b6a8b6d74cc3a0f3deaea787bf17c65ba5b2 WatchSource:0}: Error finding container ceefa7628f71a54c131c13322008b6a8b6d74cc3a0f3deaea787bf17c65ba5b2: Status 404 returned error can't find the container with id ceefa7628f71a54c131c13322008b6a8b6d74cc3a0f3deaea787bf17c65ba5b2
Apr 20 20:08:31.262827 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.262786 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59c5444cfc-k8zdh"
Apr 20 20:08:31.262827 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.262830 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59c5444cfc-k8zdh"
Apr 20 20:08:31.268583 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.268547 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59c5444cfc-k8zdh"
Apr 20 20:08:31.450369 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.450332 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-srrtl" event={"ID":"76f7379a-b448-41ee-851d-a712d528cbda","Type":"ContainerStarted","Data":"ceefa7628f71a54c131c13322008b6a8b6d74cc3a0f3deaea787bf17c65ba5b2"}
Apr 20 20:08:31.454299 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.454276 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59c5444cfc-k8zdh"
Apr 20 20:08:31.814173 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.814146 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-575b747bcc-xbtfs"]
Apr 20 20:08:31.817494 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.817478 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:31.821808 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.821782 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 20 20:08:31.821947 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.821928 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 20 20:08:31.822007 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.821991 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 20 20:08:31.822071 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.822024 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 20 20:08:31.822315 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.822293 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-sxkmm\""
Apr 20 20:08:31.822495 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.822481 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 20 20:08:31.822549 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.822486 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-80820llh2p0fi\""
Apr 20 20:08:31.834040 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.834015 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-575b747bcc-xbtfs"]
Apr 20 20:08:31.993222 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.993193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:31.993408 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.993235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-grpc-tls\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:31.993408 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.993282 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:31.993408 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.993300 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:31.993408 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.993315 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/874f316f-a84f-4d91-97df-8300d3e42b26-metrics-client-ca\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:31.993408 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.993397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:31.993638 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.993429 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-thanos-querier-tls\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:31.993638 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:31.993459 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fvjv\" (UniqueName: \"kubernetes.io/projected/874f316f-a84f-4d91-97df-8300d3e42b26-kube-api-access-2fvjv\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.094841 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.094768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fvjv\" (UniqueName: \"kubernetes.io/projected/874f316f-a84f-4d91-97df-8300d3e42b26-kube-api-access-2fvjv\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.094841 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.094805 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.094997 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.094892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-grpc-tls\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.095049 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.094993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.095243 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.095219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.095398 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.095260 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/874f316f-a84f-4d91-97df-8300d3e42b26-metrics-client-ca\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.095398 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.095305 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.095398 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.095338 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-thanos-querier-tls\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.096035 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.096010 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/874f316f-a84f-4d91-97df-8300d3e42b26-metrics-client-ca\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.097514 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.097483 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.097661 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.097643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.097761 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.097743 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-grpc-tls\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.097993 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.097972 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.098146 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.098127 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.098235 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.098219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/874f316f-a84f-4d91-97df-8300d3e42b26-secret-thanos-querier-tls\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.104947 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.104931 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fvjv\" (UniqueName: \"kubernetes.io/projected/874f316f-a84f-4d91-97df-8300d3e42b26-kube-api-access-2fvjv\") pod \"thanos-querier-575b747bcc-xbtfs\" (UID: \"874f316f-a84f-4d91-97df-8300d3e42b26\") " pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.126832 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.126812 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs"
Apr 20 20:08:32.252619 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.252596 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-575b747bcc-xbtfs"]
Apr 20 20:08:32.254458 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:08:32.254431 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod874f316f_a84f_4d91_97df_8300d3e42b26.slice/crio-fd795cb02c53a8e5ed0b7486717658128bf879ffebe3a8d79e30a083e14f4393 WatchSource:0}: Error finding container fd795cb02c53a8e5ed0b7486717658128bf879ffebe3a8d79e30a083e14f4393: Status 404 returned error can't find the container with id fd795cb02c53a8e5ed0b7486717658128bf879ffebe3a8d79e30a083e14f4393
Apr 20 20:08:32.453467 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.453401 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs" event={"ID":"874f316f-a84f-4d91-97df-8300d3e42b26","Type":"ContainerStarted","Data":"fd795cb02c53a8e5ed0b7486717658128bf879ffebe3a8d79e30a083e14f4393"}
Apr 20 20:08:32.454627 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.454607 2576 generic.go:358] "Generic (PLEG): container finished" podID="76f7379a-b448-41ee-851d-a712d528cbda" containerID="00c97dc5c700810a23a54609193ea4634baddcbfa34ab63e636e40bb40ec0aa9" exitCode=0
Apr 20 20:08:32.454715 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:32.454693 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-srrtl" event={"ID":"76f7379a-b448-41ee-851d-a712d528cbda","Type":"ContainerDied","Data":"00c97dc5c700810a23a54609193ea4634baddcbfa34ab63e636e40bb40ec0aa9"}
Apr 20 20:08:33.460599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:33.460562 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-srrtl" event={"ID":"76f7379a-b448-41ee-851d-a712d528cbda","Type":"ContainerStarted","Data":"343edaf18efbd80923a6aa413c475b0753046946459b02b7f74d85d1ef193260"}
Apr 20 20:08:33.461171 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:33.460607 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-srrtl" event={"ID":"76f7379a-b448-41ee-851d-a712d528cbda","Type":"ContainerStarted","Data":"15b1669cd5c2e24756490c0f7993811bbe23e9ad68848177cbd46d9a88d1b04a"}
Apr 20 20:08:33.510206 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:33.510087 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-srrtl" podStartSLOduration=4.618317599 podStartE2EDuration="5.510074533s" podCreationTimestamp="2026-04-20 20:08:28 +0000 UTC" firstStartedPulling="2026-04-20 20:08:30.597410146 +0000 UTC m=+185.310092121" lastFinishedPulling="2026-04-20 20:08:31.489167077 +0000 UTC m=+186.201849055" observedRunningTime="2026-04-20 20:08:33.509890498 +0000 UTC m=+188.222572493" watchObservedRunningTime="2026-04-20 20:08:33.510074533 +0000 UTC m=+188.222756528"
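
The "Observed pod startup duration" entry above packs two figures: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that E2E time minus the image-pull window (lastFinishedPulling minus firstStartedPulling). The reconstruction below lands 3ns off the logged 4.618317599, so the tracker presumably subtracts slightly different internal timestamps; for pods that pull nothing (the console pods later in this log), both pull timestamps stay at the zero time 0001-01-01 and the two durations come out equal. A sketch re-deriving node-exporter-srrtl's numbers from the logged timestamps:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Timestamps copied from the pod_startup_latency_tracker entry above.
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        created, _ := time.Parse(layout, "2026-04-20 20:08:28 +0000 UTC")
        pullStart, _ := time.Parse(layout, "2026-04-20 20:08:30.597410146 +0000 UTC")
        pullEnd, _ := time.Parse(layout, "2026-04-20 20:08:31.489167077 +0000 UTC")
        running, _ := time.Parse(layout, "2026-04-20 20:08:33.510074533 +0000 UTC")

        e2e := running.Sub(created)    // 5.510074533s = podStartE2EDuration
        pull := pullEnd.Sub(pullStart) // 891.756931ms spent pulling the image
        fmt.Println(e2e, e2e-pull)     // e2e-pull = 4.618317602s, vs. logged podStartSLOduration=4.618317599
    }
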
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs" event={"ID":"874f316f-a84f-4d91-97df-8300d3e42b26","Type":"ContainerStarted","Data":"060acd4921f533a72b305c0558969d5a23a8b3588a4cd41e47a25cce5b3133b4"} Apr 20 20:08:34.465165 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:34.464769 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs" event={"ID":"874f316f-a84f-4d91-97df-8300d3e42b26","Type":"ContainerStarted","Data":"3001b5fc4fada8602470e00a1111acc8052550fc81eba3c1c8f6d89a33cf5632"} Apr 20 20:08:35.353384 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.353358 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78556b798c-d8n9n"] Apr 20 20:08:35.393976 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.393954 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:08:35.412826 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.412803 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d4bc4b64f-qzwqf"] Apr 20 20:08:35.415800 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.415785 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.423599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.423583 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6abafad9-224c-4f34-9eea-67324b84e58c-console-oauth-config\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.423683 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.423606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-oauth-serving-cert\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.423683 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.423628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-trusted-ca-bundle\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.423818 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.423799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-console-config\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.423868 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.423828 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmzhk\" (UniqueName: \"kubernetes.io/projected/6abafad9-224c-4f34-9eea-67324b84e58c-kube-api-access-dmzhk\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 
20 20:08:35.423951 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.423937 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-service-ca\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.424014 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.423960 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6abafad9-224c-4f34-9eea-67324b84e58c-console-serving-cert\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.428596 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.428576 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d4bc4b64f-qzwqf"] Apr 20 20:08:35.469884 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.469860 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs" event={"ID":"874f316f-a84f-4d91-97df-8300d3e42b26","Type":"ContainerStarted","Data":"b60438abd519c2391d0916be4fb4bbf1f560bc50d9a5718cd4c2d436870005e1"} Apr 20 20:08:35.469884 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.469885 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs" event={"ID":"874f316f-a84f-4d91-97df-8300d3e42b26","Type":"ContainerStarted","Data":"1210f6c4855b2d02d0c4b5de0fda8ef1da4f6b6674aaae11b51ee936cd649697"} Apr 20 20:08:35.470236 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.469895 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs" event={"ID":"874f316f-a84f-4d91-97df-8300d3e42b26","Type":"ContainerStarted","Data":"d776b4c91809ee4144f9e281e36d0dba1adbc8b33c6e5846a514cf22440f6809"} Apr 20 20:08:35.470236 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.470121 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs" Apr 20 20:08:35.501334 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.501295 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs" podStartSLOduration=1.942376508 podStartE2EDuration="4.50128495s" podCreationTimestamp="2026-04-20 20:08:31 +0000 UTC" firstStartedPulling="2026-04-20 20:08:32.256247358 +0000 UTC m=+186.968929332" lastFinishedPulling="2026-04-20 20:08:34.815155788 +0000 UTC m=+189.527837774" observedRunningTime="2026-04-20 20:08:35.499463493 +0000 UTC m=+190.212145510" watchObservedRunningTime="2026-04-20 20:08:35.50128495 +0000 UTC m=+190.213966946" Apr 20 20:08:35.524883 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.524857 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-console-config\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.524965 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.524893 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dmzhk\" (UniqueName: 
\"kubernetes.io/projected/6abafad9-224c-4f34-9eea-67324b84e58c-kube-api-access-dmzhk\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.525011 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.524987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-service-ca\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.525052 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.525040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6abafad9-224c-4f34-9eea-67324b84e58c-console-serving-cert\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.525092 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.525075 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6abafad9-224c-4f34-9eea-67324b84e58c-console-oauth-config\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.525162 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.525099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-oauth-serving-cert\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.525200 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.525166 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-trusted-ca-bundle\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.525591 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.525569 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-console-config\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.525954 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.525935 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-trusted-ca-bundle\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.526161 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.526141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-service-ca\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.526434 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.526415 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-oauth-serving-cert\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.527515 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.527489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6abafad9-224c-4f34-9eea-67324b84e58c-console-oauth-config\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.527671 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.527652 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6abafad9-224c-4f34-9eea-67324b84e58c-console-serving-cert\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.536523 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.536505 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmzhk\" (UniqueName: \"kubernetes.io/projected/6abafad9-224c-4f34-9eea-67324b84e58c-kube-api-access-dmzhk\") pod \"console-7d4bc4b64f-qzwqf\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.723960 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.723890 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:35.840371 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:35.840344 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d4bc4b64f-qzwqf"] Apr 20 20:08:35.842515 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:08:35.842490 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6abafad9_224c_4f34_9eea_67324b84e58c.slice/crio-3e55dfbd011d023a566c54f04003803c05473d31f45fcdeb81b9881ddc1f08c0 WatchSource:0}: Error finding container 3e55dfbd011d023a566c54f04003803c05473d31f45fcdeb81b9881ddc1f08c0: Status 404 returned error can't find the container with id 3e55dfbd011d023a566c54f04003803c05473d31f45fcdeb81b9881ddc1f08c0 Apr 20 20:08:36.473814 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:36.473771 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d4bc4b64f-qzwqf" event={"ID":"6abafad9-224c-4f34-9eea-67324b84e58c","Type":"ContainerStarted","Data":"264e41ced12a10fa2340605bcf10acd7c20b45b0cf5ea8ae91f6fec14e196d17"} Apr 20 20:08:36.473814 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:36.473820 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d4bc4b64f-qzwqf" event={"ID":"6abafad9-224c-4f34-9eea-67324b84e58c","Type":"ContainerStarted","Data":"3e55dfbd011d023a566c54f04003803c05473d31f45fcdeb81b9881ddc1f08c0"} Apr 20 20:08:36.495801 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:36.495764 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d4bc4b64f-qzwqf" podStartSLOduration=1.495752935 podStartE2EDuration="1.495752935s" podCreationTimestamp="2026-04-20 20:08:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:08:36.495052326 +0000 UTC m=+191.207734334" watchObservedRunningTime="2026-04-20 20:08:36.495752935 +0000 UTC m=+191.208434928" Apr 20 20:08:37.451325 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:37.451294 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-78556b798c-d8n9n" Apr 20 20:08:41.479581 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:41.479550 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-575b747bcc-xbtfs" Apr 20 20:08:42.971785 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:42.971755 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6f7749bc57-7ktsn"] Apr 20 20:08:43.682552 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.682516 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d4bc4b64f-qzwqf"] Apr 20 20:08:43.712876 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.712840 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c459df6d8-5f5rg"] Apr 20 20:08:43.717095 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.717073 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.732851 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.732828 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c459df6d8-5f5rg"] Apr 20 20:08:43.781033 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.781011 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-trusted-ca-bundle\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.781164 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.781066 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-oauth-serving-cert\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.781164 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.781126 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-config\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.781258 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.781166 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5n7n\" (UniqueName: \"kubernetes.io/projected/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-kube-api-access-k5n7n\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.781258 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.781202 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-service-ca\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.781258 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.781241 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-oauth-config\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.781258 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.781257 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-serving-cert\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.881725 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.881698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-oauth-serving-cert\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.881811 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.881734 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-config\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.881811 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.881753 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5n7n\" (UniqueName: \"kubernetes.io/projected/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-kube-api-access-k5n7n\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.881896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.881874 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-service-ca\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.881950 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.881924 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-oauth-config\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.882077 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.881962 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-serving-cert\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 
20:08:43.882077 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.882005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-trusted-ca-bundle\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.882502 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.882473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-oauth-serving-cert\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.882579 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.882526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-config\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.882579 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.882560 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-service-ca\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.882811 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.882793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-trusted-ca-bundle\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.884513 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.884489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-serving-cert\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.884583 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.884567 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-oauth-config\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:43.891183 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:43.891164 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5n7n\" (UniqueName: \"kubernetes.io/projected/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-kube-api-access-k5n7n\") pod \"console-c459df6d8-5f5rg\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:44.025961 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:44.025897 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:44.154930 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:44.154907 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c459df6d8-5f5rg"] Apr 20 20:08:44.157408 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:08:44.157386 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6b38bc1_5e7c_4c5f_aded_bf5d99a8ace8.slice/crio-8974c5137a3946baab0c1ba1475c8d21585e79b8eb7997fb023e72797b296b76 WatchSource:0}: Error finding container 8974c5137a3946baab0c1ba1475c8d21585e79b8eb7997fb023e72797b296b76: Status 404 returned error can't find the container with id 8974c5137a3946baab0c1ba1475c8d21585e79b8eb7997fb023e72797b296b76 Apr 20 20:08:44.504401 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:44.504370 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c459df6d8-5f5rg" event={"ID":"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8","Type":"ContainerStarted","Data":"9d156dc2fe9847b11886104b42c1bd9359593a9be63380a3f6e7a2813c891d75"} Apr 20 20:08:44.504401 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:44.504405 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c459df6d8-5f5rg" event={"ID":"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8","Type":"ContainerStarted","Data":"8974c5137a3946baab0c1ba1475c8d21585e79b8eb7997fb023e72797b296b76"} Apr 20 20:08:44.528367 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:44.528322 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c459df6d8-5f5rg" podStartSLOduration=1.528308992 podStartE2EDuration="1.528308992s" podCreationTimestamp="2026-04-20 20:08:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:08:44.527162386 +0000 UTC m=+199.239844381" watchObservedRunningTime="2026-04-20 20:08:44.528308992 +0000 UTC m=+199.240990982" Apr 20 20:08:45.724224 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:45.724195 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:08:54.026837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:54.026807 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:54.026837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:54.026839 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:54.031386 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:54.031367 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:54.537160 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:54.537137 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:08:54.583717 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:54.583690 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59c5444cfc-k8zdh"] Apr 20 20:08:59.547935 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:59.547905 2576 generic.go:358] "Generic (PLEG): container finished" podID="bc006b49-b340-4b63-9917-d78702246b64" containerID="1acae262c21e958932ff87f8782982d0f9f46fbb078a5fe863e376e83e0ca704" 
exitCode=0 Apr 20 20:08:59.548338 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:59.547975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" event={"ID":"bc006b49-b340-4b63-9917-d78702246b64","Type":"ContainerDied","Data":"1acae262c21e958932ff87f8782982d0f9f46fbb078a5fe863e376e83e0ca704"} Apr 20 20:08:59.548338 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:08:59.548268 2576 scope.go:117] "RemoveContainer" containerID="1acae262c21e958932ff87f8782982d0f9f46fbb078a5fe863e376e83e0ca704" Apr 20 20:09:00.371363 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.371322 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-78556b798c-d8n9n" podUID="6832f12c-cb68-4e4c-9dde-084003a3d82d" containerName="console" containerID="cri-o://ea099e5dbeffe2ff442c0ed6edaa7b0d8c887b29a26747f79ff9a74db3b565b4" gracePeriod=15 Apr 20 20:09:00.553466 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.553427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-jgpb4" event={"ID":"bc006b49-b340-4b63-9917-d78702246b64","Type":"ContainerStarted","Data":"a715450fa2a951df853d12175eb39f8d46ad706f4f9ed0571f5ec7dcd43d8101"} Apr 20 20:09:00.555598 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.555572 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78556b798c-d8n9n_6832f12c-cb68-4e4c-9dde-084003a3d82d/console/0.log" Apr 20 20:09:00.555598 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.555612 2576 generic.go:358] "Generic (PLEG): container finished" podID="6832f12c-cb68-4e4c-9dde-084003a3d82d" containerID="ea099e5dbeffe2ff442c0ed6edaa7b0d8c887b29a26747f79ff9a74db3b565b4" exitCode=2 Apr 20 20:09:00.555848 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.555713 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78556b798c-d8n9n" event={"ID":"6832f12c-cb68-4e4c-9dde-084003a3d82d","Type":"ContainerDied","Data":"ea099e5dbeffe2ff442c0ed6edaa7b0d8c887b29a26747f79ff9a74db3b565b4"} Apr 20 20:09:00.619816 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.619795 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78556b798c-d8n9n_6832f12c-cb68-4e4c-9dde-084003a3d82d/console/0.log" Apr 20 20:09:00.619922 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.619865 2576 util.go:48] "No ready sandbox for pod can be found. 
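
The "Killing container with a grace period" entry above is the graceful half of container shutdown: the kubelet asks CRI-O to stop the old console container and allows it gracePeriod=15 seconds before a hard kill; here the process exits on its own with exitCode=2 first. A sketch of the usual SIGTERM-then-SIGKILL contract behind a CRI StopContainer(id, timeout) call, applied to a local Unix process for illustration (the "sleep" child is a stand-in for the container process):

    package main

    import (
        "fmt"
        "os/exec"
        "syscall"
        "time"
    )

    // stopGracefully sends SIGTERM, waits up to grace for the process to
    // exit, then falls back to SIGKILL -- the same contract as a CRI
    // StopContainer(id, timeout) call with gracePeriod seconds.
    func stopGracefully(cmd *exec.Cmd, grace time.Duration) {
        _ = cmd.Process.Signal(syscall.SIGTERM)
        done := make(chan error, 1)
        go func() { done <- cmd.Wait() }()
        select {
        case <-done: // exited within the grace period (the console exited with code 2)
        case <-time.After(grace):
            _ = cmd.Process.Kill() // grace period exhausted: SIGKILL
            <-done
        }
    }

    func main() {
        cmd := exec.Command("sleep", "60") // stand-in for the container process
        if err := cmd.Start(); err != nil {
            fmt.Println(err)
            return
        }
        stopGracefully(cmd, 15*time.Second) // gracePeriod=15, as in the log entry
        fmt.Println("stopped:", cmd.ProcessState)
    }
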
Need to start a new one" pod="openshift-console/console-78556b798c-d8n9n" Apr 20 20:09:00.713767 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.713689 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-serving-cert\") pod \"6832f12c-cb68-4e4c-9dde-084003a3d82d\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " Apr 20 20:09:00.713767 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.713742 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-service-ca\") pod \"6832f12c-cb68-4e4c-9dde-084003a3d82d\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " Apr 20 20:09:00.713767 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.713768 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-oauth-serving-cert\") pod \"6832f12c-cb68-4e4c-9dde-084003a3d82d\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " Apr 20 20:09:00.714027 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.713785 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-config\") pod \"6832f12c-cb68-4e4c-9dde-084003a3d82d\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " Apr 20 20:09:00.714027 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.713802 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-trusted-ca-bundle\") pod \"6832f12c-cb68-4e4c-9dde-084003a3d82d\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " Apr 20 20:09:00.714027 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.713848 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjgwj\" (UniqueName: \"kubernetes.io/projected/6832f12c-cb68-4e4c-9dde-084003a3d82d-kube-api-access-hjgwj\") pod \"6832f12c-cb68-4e4c-9dde-084003a3d82d\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " Apr 20 20:09:00.714027 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.713886 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-oauth-config\") pod \"6832f12c-cb68-4e4c-9dde-084003a3d82d\" (UID: \"6832f12c-cb68-4e4c-9dde-084003a3d82d\") " Apr 20 20:09:00.714279 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.714191 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6832f12c-cb68-4e4c-9dde-084003a3d82d" (UID: "6832f12c-cb68-4e4c-9dde-084003a3d82d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:00.714279 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.714232 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-config" (OuterVolumeSpecName: "console-config") pod "6832f12c-cb68-4e4c-9dde-084003a3d82d" (UID: "6832f12c-cb68-4e4c-9dde-084003a3d82d"). 
InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:00.714354 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.714282 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-service-ca" (OuterVolumeSpecName: "service-ca") pod "6832f12c-cb68-4e4c-9dde-084003a3d82d" (UID: "6832f12c-cb68-4e4c-9dde-084003a3d82d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:00.714579 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.714538 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6832f12c-cb68-4e4c-9dde-084003a3d82d" (UID: "6832f12c-cb68-4e4c-9dde-084003a3d82d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:00.716178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.716151 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6832f12c-cb68-4e4c-9dde-084003a3d82d-kube-api-access-hjgwj" (OuterVolumeSpecName: "kube-api-access-hjgwj") pod "6832f12c-cb68-4e4c-9dde-084003a3d82d" (UID: "6832f12c-cb68-4e4c-9dde-084003a3d82d"). InnerVolumeSpecName "kube-api-access-hjgwj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:00.716298 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.716278 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6832f12c-cb68-4e4c-9dde-084003a3d82d" (UID: "6832f12c-cb68-4e4c-9dde-084003a3d82d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:00.716365 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.716348 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6832f12c-cb68-4e4c-9dde-084003a3d82d" (UID: "6832f12c-cb68-4e4c-9dde-084003a3d82d"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:00.815037 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.815014 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-oauth-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:00.815037 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.815035 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-serving-cert\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:00.815170 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.815045 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-service-ca\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:00.815170 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.815054 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-oauth-serving-cert\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:00.815170 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.815067 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-console-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:00.815170 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.815079 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6832f12c-cb68-4e4c-9dde-084003a3d82d-trusted-ca-bundle\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:00.815170 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:00.815087 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hjgwj\" (UniqueName: \"kubernetes.io/projected/6832f12c-cb68-4e4c-9dde-084003a3d82d-kube-api-access-hjgwj\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:01.560322 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:01.560298 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78556b798c-d8n9n_6832f12c-cb68-4e4c-9dde-084003a3d82d/console/0.log" Apr 20 20:09:01.560772 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:01.560354 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78556b798c-d8n9n" event={"ID":"6832f12c-cb68-4e4c-9dde-084003a3d82d","Type":"ContainerDied","Data":"3e1287a4a74d8762f0c9801eb3cb28382ef2705aa464e9d1cb1b5aefff6c9e23"} Apr 20 20:09:01.560772 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:01.560386 2576 scope.go:117] "RemoveContainer" containerID="ea099e5dbeffe2ff442c0ed6edaa7b0d8c887b29a26747f79ff9a74db3b565b4" Apr 20 20:09:01.560772 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:01.560410 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78556b798c-d8n9n" Apr 20 20:09:01.581970 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:01.581947 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78556b798c-d8n9n"] Apr 20 20:09:01.588092 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:01.588074 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-78556b798c-d8n9n"] Apr 20 20:09:01.943475 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:01.943437 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6832f12c-cb68-4e4c-9dde-084003a3d82d" path="/var/lib/kubelet/pods/6832f12c-cb68-4e4c-9dde-084003a3d82d/volumes" Apr 20 20:09:07.989953 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:07.989918 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" podUID="a78ff420-4c0e-42f9-a564-7d110a40f9d0" containerName="registry" containerID="cri-o://38c993cad43d63daad74446f4e37a41ef261180edc53d9d63288ce3ca6f2d2ec" gracePeriod=30 Apr 20 20:09:08.217370 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.217351 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:09:08.269066 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.269003 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a78ff420-4c0e-42f9-a564-7d110a40f9d0-installation-pull-secrets\") pod \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " Apr 20 20:09:08.269066 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.269050 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a78ff420-4c0e-42f9-a564-7d110a40f9d0-trusted-ca\") pod \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " Apr 20 20:09:08.269260 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.269087 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls\") pod \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " Apr 20 20:09:08.269260 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.269130 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsprl\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-kube-api-access-lsprl\") pod \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " Apr 20 20:09:08.269260 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.269157 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a78ff420-4c0e-42f9-a564-7d110a40f9d0-image-registry-private-configuration\") pod \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " Apr 20 20:09:08.269260 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.269180 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-certificates\") pod 
\"a78ff420-4c0e-42f9-a564-7d110a40f9d0\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " Apr 20 20:09:08.269260 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.269215 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a78ff420-4c0e-42f9-a564-7d110a40f9d0-ca-trust-extracted\") pod \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " Apr 20 20:09:08.269260 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.269238 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-bound-sa-token\") pod \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\" (UID: \"a78ff420-4c0e-42f9-a564-7d110a40f9d0\") " Apr 20 20:09:08.269591 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.269565 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a78ff420-4c0e-42f9-a564-7d110a40f9d0-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a78ff420-4c0e-42f9-a564-7d110a40f9d0" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:08.269706 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.269678 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a78ff420-4c0e-42f9-a564-7d110a40f9d0" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:08.271528 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.271498 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a78ff420-4c0e-42f9-a564-7d110a40f9d0" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:08.271632 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.271573 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a78ff420-4c0e-42f9-a564-7d110a40f9d0" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:08.271764 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.271732 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a78ff420-4c0e-42f9-a564-7d110a40f9d0-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a78ff420-4c0e-42f9-a564-7d110a40f9d0" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:08.271764 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.271760 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-kube-api-access-lsprl" (OuterVolumeSpecName: "kube-api-access-lsprl") pod "a78ff420-4c0e-42f9-a564-7d110a40f9d0" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0"). 
InnerVolumeSpecName "kube-api-access-lsprl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:08.272026 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.272010 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a78ff420-4c0e-42f9-a564-7d110a40f9d0-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a78ff420-4c0e-42f9-a564-7d110a40f9d0" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:08.278257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.278232 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a78ff420-4c0e-42f9-a564-7d110a40f9d0-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a78ff420-4c0e-42f9-a564-7d110a40f9d0" (UID: "a78ff420-4c0e-42f9-a564-7d110a40f9d0"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:09:08.370047 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.370021 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a78ff420-4c0e-42f9-a564-7d110a40f9d0-trusted-ca\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:08.370047 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.370044 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:08.370204 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.370061 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lsprl\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-kube-api-access-lsprl\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:08.370204 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.370074 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a78ff420-4c0e-42f9-a564-7d110a40f9d0-image-registry-private-configuration\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:08.370204 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.370084 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a78ff420-4c0e-42f9-a564-7d110a40f9d0-registry-certificates\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:08.370204 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.370092 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a78ff420-4c0e-42f9-a564-7d110a40f9d0-ca-trust-extracted\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:08.370204 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.370101 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a78ff420-4c0e-42f9-a564-7d110a40f9d0-bound-sa-token\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:08.370204 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.370126 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/a78ff420-4c0e-42f9-a564-7d110a40f9d0-installation-pull-secrets\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:08.579923 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.579894 2576 generic.go:358] "Generic (PLEG): container finished" podID="a78ff420-4c0e-42f9-a564-7d110a40f9d0" containerID="38c993cad43d63daad74446f4e37a41ef261180edc53d9d63288ce3ca6f2d2ec" exitCode=0 Apr 20 20:09:08.580043 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.579951 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" Apr 20 20:09:08.580043 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.579968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" event={"ID":"a78ff420-4c0e-42f9-a564-7d110a40f9d0","Type":"ContainerDied","Data":"38c993cad43d63daad74446f4e37a41ef261180edc53d9d63288ce3ca6f2d2ec"} Apr 20 20:09:08.580043 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.579991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6f7749bc57-7ktsn" event={"ID":"a78ff420-4c0e-42f9-a564-7d110a40f9d0","Type":"ContainerDied","Data":"4b5647dd18ff0863bcaad60cd6f7f56911504ea12a16f007acb1013776019058"} Apr 20 20:09:08.580043 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.580006 2576 scope.go:117] "RemoveContainer" containerID="38c993cad43d63daad74446f4e37a41ef261180edc53d9d63288ce3ca6f2d2ec" Apr 20 20:09:08.587717 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.587698 2576 scope.go:117] "RemoveContainer" containerID="38c993cad43d63daad74446f4e37a41ef261180edc53d9d63288ce3ca6f2d2ec" Apr 20 20:09:08.587956 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:09:08.587938 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38c993cad43d63daad74446f4e37a41ef261180edc53d9d63288ce3ca6f2d2ec\": container with ID starting with 38c993cad43d63daad74446f4e37a41ef261180edc53d9d63288ce3ca6f2d2ec not found: ID does not exist" containerID="38c993cad43d63daad74446f4e37a41ef261180edc53d9d63288ce3ca6f2d2ec" Apr 20 20:09:08.588017 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.587961 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c993cad43d63daad74446f4e37a41ef261180edc53d9d63288ce3ca6f2d2ec"} err="failed to get container status \"38c993cad43d63daad74446f4e37a41ef261180edc53d9d63288ce3ca6f2d2ec\": rpc error: code = NotFound desc = could not find container \"38c993cad43d63daad74446f4e37a41ef261180edc53d9d63288ce3ca6f2d2ec\": container with ID starting with 38c993cad43d63daad74446f4e37a41ef261180edc53d9d63288ce3ca6f2d2ec not found: ID does not exist" Apr 20 20:09:08.606824 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.606788 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6f7749bc57-7ktsn"] Apr 20 20:09:08.608859 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.608838 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6f7749bc57-7ktsn"] Apr 20 20:09:08.700292 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.700243 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7d4bc4b64f-qzwqf" podUID="6abafad9-224c-4f34-9eea-67324b84e58c" containerName="console" 
containerID="cri-o://264e41ced12a10fa2340605bcf10acd7c20b45b0cf5ea8ae91f6fec14e196d17" gracePeriod=15 Apr 20 20:09:08.934082 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.934063 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d4bc4b64f-qzwqf_6abafad9-224c-4f34-9eea-67324b84e58c/console/0.log" Apr 20 20:09:08.934236 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.934214 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:09:08.974540 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.974514 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-trusted-ca-bundle\") pod \"6abafad9-224c-4f34-9eea-67324b84e58c\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " Apr 20 20:09:08.974647 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.974558 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6abafad9-224c-4f34-9eea-67324b84e58c-console-serving-cert\") pod \"6abafad9-224c-4f34-9eea-67324b84e58c\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " Apr 20 20:09:08.974647 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.974585 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-console-config\") pod \"6abafad9-224c-4f34-9eea-67324b84e58c\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " Apr 20 20:09:08.974734 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.974712 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6abafad9-224c-4f34-9eea-67324b84e58c-console-oauth-config\") pod \"6abafad9-224c-4f34-9eea-67324b84e58c\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " Apr 20 20:09:08.974805 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.974784 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-oauth-serving-cert\") pod \"6abafad9-224c-4f34-9eea-67324b84e58c\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " Apr 20 20:09:08.974916 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.974822 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmzhk\" (UniqueName: \"kubernetes.io/projected/6abafad9-224c-4f34-9eea-67324b84e58c-kube-api-access-dmzhk\") pod \"6abafad9-224c-4f34-9eea-67324b84e58c\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " Apr 20 20:09:08.974916 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.974851 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-service-ca\") pod \"6abafad9-224c-4f34-9eea-67324b84e58c\" (UID: \"6abafad9-224c-4f34-9eea-67324b84e58c\") " Apr 20 20:09:08.974916 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.974892 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-console-config" (OuterVolumeSpecName: "console-config") pod "6abafad9-224c-4f34-9eea-67324b84e58c" (UID: 
"6abafad9-224c-4f34-9eea-67324b84e58c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:08.975071 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.974920 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6abafad9-224c-4f34-9eea-67324b84e58c" (UID: "6abafad9-224c-4f34-9eea-67324b84e58c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:08.975165 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.975094 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-trusted-ca-bundle\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:08.975165 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.975125 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-console-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:08.975165 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.975151 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "6abafad9-224c-4f34-9eea-67324b84e58c" (UID: "6abafad9-224c-4f34-9eea-67324b84e58c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:08.975343 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.975323 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-service-ca" (OuterVolumeSpecName: "service-ca") pod "6abafad9-224c-4f34-9eea-67324b84e58c" (UID: "6abafad9-224c-4f34-9eea-67324b84e58c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:08.976663 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.976637 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6abafad9-224c-4f34-9eea-67324b84e58c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "6abafad9-224c-4f34-9eea-67324b84e58c" (UID: "6abafad9-224c-4f34-9eea-67324b84e58c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:08.976733 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.976682 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6abafad9-224c-4f34-9eea-67324b84e58c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "6abafad9-224c-4f34-9eea-67324b84e58c" (UID: "6abafad9-224c-4f34-9eea-67324b84e58c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:08.977292 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:08.977276 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6abafad9-224c-4f34-9eea-67324b84e58c-kube-api-access-dmzhk" (OuterVolumeSpecName: "kube-api-access-dmzhk") pod "6abafad9-224c-4f34-9eea-67324b84e58c" (UID: "6abafad9-224c-4f34-9eea-67324b84e58c"). InnerVolumeSpecName "kube-api-access-dmzhk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:09.075888 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.075865 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-oauth-serving-cert\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:09.075888 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.075885 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dmzhk\" (UniqueName: \"kubernetes.io/projected/6abafad9-224c-4f34-9eea-67324b84e58c-kube-api-access-dmzhk\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:09.076211 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.075895 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6abafad9-224c-4f34-9eea-67324b84e58c-service-ca\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:09.076211 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.075904 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6abafad9-224c-4f34-9eea-67324b84e58c-console-serving-cert\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:09.076211 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.075912 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6abafad9-224c-4f34-9eea-67324b84e58c-console-oauth-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:09.583568 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.583544 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d4bc4b64f-qzwqf_6abafad9-224c-4f34-9eea-67324b84e58c/console/0.log" Apr 20 20:09:09.583746 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.583581 2576 generic.go:358] "Generic (PLEG): container finished" podID="6abafad9-224c-4f34-9eea-67324b84e58c" containerID="264e41ced12a10fa2340605bcf10acd7c20b45b0cf5ea8ae91f6fec14e196d17" exitCode=2 Apr 20 20:09:09.583746 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.583656 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d4bc4b64f-qzwqf" Apr 20 20:09:09.583746 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.583657 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d4bc4b64f-qzwqf" event={"ID":"6abafad9-224c-4f34-9eea-67324b84e58c","Type":"ContainerDied","Data":"264e41ced12a10fa2340605bcf10acd7c20b45b0cf5ea8ae91f6fec14e196d17"} Apr 20 20:09:09.583895 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.583777 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d4bc4b64f-qzwqf" event={"ID":"6abafad9-224c-4f34-9eea-67324b84e58c","Type":"ContainerDied","Data":"3e55dfbd011d023a566c54f04003803c05473d31f45fcdeb81b9881ddc1f08c0"} Apr 20 20:09:09.583895 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.583806 2576 scope.go:117] "RemoveContainer" containerID="264e41ced12a10fa2340605bcf10acd7c20b45b0cf5ea8ae91f6fec14e196d17" Apr 20 20:09:09.591759 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.591746 2576 scope.go:117] "RemoveContainer" containerID="264e41ced12a10fa2340605bcf10acd7c20b45b0cf5ea8ae91f6fec14e196d17" Apr 20 20:09:09.592003 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:09:09.591981 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"264e41ced12a10fa2340605bcf10acd7c20b45b0cf5ea8ae91f6fec14e196d17\": container with ID starting with 264e41ced12a10fa2340605bcf10acd7c20b45b0cf5ea8ae91f6fec14e196d17 not found: ID does not exist" containerID="264e41ced12a10fa2340605bcf10acd7c20b45b0cf5ea8ae91f6fec14e196d17" Apr 20 20:09:09.592094 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.592007 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"264e41ced12a10fa2340605bcf10acd7c20b45b0cf5ea8ae91f6fec14e196d17"} err="failed to get container status \"264e41ced12a10fa2340605bcf10acd7c20b45b0cf5ea8ae91f6fec14e196d17\": rpc error: code = NotFound desc = could not find container \"264e41ced12a10fa2340605bcf10acd7c20b45b0cf5ea8ae91f6fec14e196d17\": container with ID starting with 264e41ced12a10fa2340605bcf10acd7c20b45b0cf5ea8ae91f6fec14e196d17 not found: ID does not exist" Apr 20 20:09:09.605677 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.605654 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d4bc4b64f-qzwqf"] Apr 20 20:09:09.609441 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.609423 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d4bc4b64f-qzwqf"] Apr 20 20:09:09.946271 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.946195 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6abafad9-224c-4f34-9eea-67324b84e58c" path="/var/lib/kubelet/pods/6abafad9-224c-4f34-9eea-67324b84e58c/volumes" Apr 20 20:09:09.946581 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:09.946569 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a78ff420-4c0e-42f9-a564-7d110a40f9d0" path="/var/lib/kubelet/pods/a78ff420-4c0e-42f9-a564-7d110a40f9d0/volumes" Apr 20 20:09:19.604693 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:19.604642 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-59c5444cfc-k8zdh" podUID="bf5957de-812d-47f7-b353-00ec68e9ecf4" containerName="console" containerID="cri-o://36963ee6200dc59d039abeb3f4d793ab77de6980063581593ca2d8c839ca1624" gracePeriod=15 Apr 20 20:09:19.837774 ip-10-0-143-23 
kubenswrapper[2576]: I0420 20:09:19.837751 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59c5444cfc-k8zdh_bf5957de-812d-47f7-b353-00ec68e9ecf4/console/0.log" Apr 20 20:09:19.837862 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:19.837807 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:09:19.944629 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:19.944566 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-oauth-serving-cert\") pod \"bf5957de-812d-47f7-b353-00ec68e9ecf4\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " Apr 20 20:09:19.944629 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:19.944599 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-oauth-config\") pod \"bf5957de-812d-47f7-b353-00ec68e9ecf4\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " Apr 20 20:09:19.944629 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:19.944622 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-service-ca\") pod \"bf5957de-812d-47f7-b353-00ec68e9ecf4\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " Apr 20 20:09:19.944849 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:19.944733 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-config\") pod \"bf5957de-812d-47f7-b353-00ec68e9ecf4\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " Apr 20 20:09:19.944849 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:19.944768 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24m45\" (UniqueName: \"kubernetes.io/projected/bf5957de-812d-47f7-b353-00ec68e9ecf4-kube-api-access-24m45\") pod \"bf5957de-812d-47f7-b353-00ec68e9ecf4\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " Apr 20 20:09:19.944849 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:19.944802 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-serving-cert\") pod \"bf5957de-812d-47f7-b353-00ec68e9ecf4\" (UID: \"bf5957de-812d-47f7-b353-00ec68e9ecf4\") " Apr 20 20:09:19.945047 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:19.945014 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "bf5957de-812d-47f7-b353-00ec68e9ecf4" (UID: "bf5957de-812d-47f7-b353-00ec68e9ecf4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:19.945047 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:19.945031 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-service-ca" (OuterVolumeSpecName: "service-ca") pod "bf5957de-812d-47f7-b353-00ec68e9ecf4" (UID: "bf5957de-812d-47f7-b353-00ec68e9ecf4"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:19.945205 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:19.945144 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-config" (OuterVolumeSpecName: "console-config") pod "bf5957de-812d-47f7-b353-00ec68e9ecf4" (UID: "bf5957de-812d-47f7-b353-00ec68e9ecf4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:09:19.946600 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:19.946579 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "bf5957de-812d-47f7-b353-00ec68e9ecf4" (UID: "bf5957de-812d-47f7-b353-00ec68e9ecf4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:19.946723 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:19.946701 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf5957de-812d-47f7-b353-00ec68e9ecf4-kube-api-access-24m45" (OuterVolumeSpecName: "kube-api-access-24m45") pod "bf5957de-812d-47f7-b353-00ec68e9ecf4" (UID: "bf5957de-812d-47f7-b353-00ec68e9ecf4"). InnerVolumeSpecName "kube-api-access-24m45". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:09:19.946787 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:19.946701 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "bf5957de-812d-47f7-b353-00ec68e9ecf4" (UID: "bf5957de-812d-47f7-b353-00ec68e9ecf4"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:09:20.046347 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.046314 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-oauth-serving-cert\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:20.046347 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.046345 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-oauth-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:20.046347 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.046355 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-service-ca\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:20.046512 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.046365 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:20.046512 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.046373 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-24m45\" (UniqueName: \"kubernetes.io/projected/bf5957de-812d-47f7-b353-00ec68e9ecf4-kube-api-access-24m45\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:20.046512 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.046383 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/bf5957de-812d-47f7-b353-00ec68e9ecf4-console-serving-cert\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:09:20.615315 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.615288 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59c5444cfc-k8zdh_bf5957de-812d-47f7-b353-00ec68e9ecf4/console/0.log" Apr 20 20:09:20.615689 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.615327 2576 generic.go:358] "Generic (PLEG): container finished" podID="bf5957de-812d-47f7-b353-00ec68e9ecf4" containerID="36963ee6200dc59d039abeb3f4d793ab77de6980063581593ca2d8c839ca1624" exitCode=2 Apr 20 20:09:20.615689 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.615386 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59c5444cfc-k8zdh" Apr 20 20:09:20.615689 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.615411 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c5444cfc-k8zdh" event={"ID":"bf5957de-812d-47f7-b353-00ec68e9ecf4","Type":"ContainerDied","Data":"36963ee6200dc59d039abeb3f4d793ab77de6980063581593ca2d8c839ca1624"} Apr 20 20:09:20.615689 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.615449 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c5444cfc-k8zdh" event={"ID":"bf5957de-812d-47f7-b353-00ec68e9ecf4","Type":"ContainerDied","Data":"248d5b95f6497723e9d31198ae02297ad7bc6eb61b6c3f5c12becaa7b759854f"} Apr 20 20:09:20.615689 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.615464 2576 scope.go:117] "RemoveContainer" containerID="36963ee6200dc59d039abeb3f4d793ab77de6980063581593ca2d8c839ca1624" Apr 20 20:09:20.622863 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.622849 2576 scope.go:117] "RemoveContainer" containerID="36963ee6200dc59d039abeb3f4d793ab77de6980063581593ca2d8c839ca1624" Apr 20 20:09:20.623102 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:09:20.623085 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36963ee6200dc59d039abeb3f4d793ab77de6980063581593ca2d8c839ca1624\": container with ID starting with 36963ee6200dc59d039abeb3f4d793ab77de6980063581593ca2d8c839ca1624 not found: ID does not exist" containerID="36963ee6200dc59d039abeb3f4d793ab77de6980063581593ca2d8c839ca1624" Apr 20 20:09:20.623163 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.623131 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36963ee6200dc59d039abeb3f4d793ab77de6980063581593ca2d8c839ca1624"} err="failed to get container status \"36963ee6200dc59d039abeb3f4d793ab77de6980063581593ca2d8c839ca1624\": rpc error: code = NotFound desc = could not find container \"36963ee6200dc59d039abeb3f4d793ab77de6980063581593ca2d8c839ca1624\": container with ID starting with 36963ee6200dc59d039abeb3f4d793ab77de6980063581593ca2d8c839ca1624 not found: ID does not exist" Apr 20 20:09:20.635096 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.635077 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59c5444cfc-k8zdh"] Apr 20 20:09:20.639139 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:20.639090 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59c5444cfc-k8zdh"] Apr 20 20:09:21.943307 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:21.943277 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf5957de-812d-47f7-b353-00ec68e9ecf4" path="/var/lib/kubelet/pods/bf5957de-812d-47f7-b353-00ec68e9ecf4/volumes" Apr 20 20:09:37.677370 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:37.677299 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs\") pod \"network-metrics-daemon-2n29c\" (UID: \"4c96dea8-a54f-4ca2-a3fb-757208554fe3\") " pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:09:37.679473 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:37.679451 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c96dea8-a54f-4ca2-a3fb-757208554fe3-metrics-certs\") pod 
\"network-metrics-daemon-2n29c\" (UID: \"4c96dea8-a54f-4ca2-a3fb-757208554fe3\") " pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:09:37.847933 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:37.847907 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gwc22\"" Apr 20 20:09:37.855354 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:37.855334 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-2n29c" Apr 20 20:09:37.968506 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:37.968443 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-2n29c"] Apr 20 20:09:37.971739 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:09:37.971713 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c96dea8_a54f_4ca2_a3fb_757208554fe3.slice/crio-1957557698e3d9b301de936fd1e9a0c035406ca553f81c3772bd7803d8b602a2 WatchSource:0}: Error finding container 1957557698e3d9b301de936fd1e9a0c035406ca553f81c3772bd7803d8b602a2: Status 404 returned error can't find the container with id 1957557698e3d9b301de936fd1e9a0c035406ca553f81c3772bd7803d8b602a2 Apr 20 20:09:38.664554 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:38.664515 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2n29c" event={"ID":"4c96dea8-a54f-4ca2-a3fb-757208554fe3","Type":"ContainerStarted","Data":"1957557698e3d9b301de936fd1e9a0c035406ca553f81c3772bd7803d8b602a2"} Apr 20 20:09:39.668959 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:39.668926 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2n29c" event={"ID":"4c96dea8-a54f-4ca2-a3fb-757208554fe3","Type":"ContainerStarted","Data":"0d3fdee4348035d6cd1b64dc193c8071d562e483e1a9c9f8237fff21ecc006c0"} Apr 20 20:09:39.668959 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:39.668960 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-2n29c" event={"ID":"4c96dea8-a54f-4ca2-a3fb-757208554fe3","Type":"ContainerStarted","Data":"8240a7e5fc8acb864d8028c5903d69f98a4ab9e45f6709c460b61a6174235691"} Apr 20 20:09:39.685453 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:09:39.685411 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-2n29c" podStartSLOduration=253.81601784 podStartE2EDuration="4m14.685397644s" podCreationTimestamp="2026-04-20 20:05:25 +0000 UTC" firstStartedPulling="2026-04-20 20:09:37.973395113 +0000 UTC m=+252.686077096" lastFinishedPulling="2026-04-20 20:09:38.842774922 +0000 UTC m=+253.555456900" observedRunningTime="2026-04-20 20:09:39.683953011 +0000 UTC m=+254.396635008" watchObservedRunningTime="2026-04-20 20:09:39.685397644 +0000 UTC m=+254.398079639" Apr 20 20:10:00.959038 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:00.959003 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7c4f755c7b-22lkx"] Apr 20 20:10:00.959412 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:00.959284 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a78ff420-4c0e-42f9-a564-7d110a40f9d0" containerName="registry" Apr 20 20:10:00.959412 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:00.959296 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a78ff420-4c0e-42f9-a564-7d110a40f9d0" containerName="registry" Apr 20 20:10:00.959412 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:00.959306 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf5957de-812d-47f7-b353-00ec68e9ecf4" containerName="console" Apr 20 20:10:00.959412 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:00.959311 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf5957de-812d-47f7-b353-00ec68e9ecf4" containerName="console" Apr 20 20:10:00.959412 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:00.959325 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6abafad9-224c-4f34-9eea-67324b84e58c" containerName="console" Apr 20 20:10:00.959412 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:00.959331 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6abafad9-224c-4f34-9eea-67324b84e58c" containerName="console" Apr 20 20:10:00.959412 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:00.959345 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6832f12c-cb68-4e4c-9dde-084003a3d82d" containerName="console" Apr 20 20:10:00.959412 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:00.959350 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6832f12c-cb68-4e4c-9dde-084003a3d82d" containerName="console" Apr 20 20:10:00.959412 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:00.959400 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf5957de-812d-47f7-b353-00ec68e9ecf4" containerName="console" Apr 20 20:10:00.959412 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:00.959408 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a78ff420-4c0e-42f9-a564-7d110a40f9d0" containerName="registry" Apr 20 20:10:00.959412 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:00.959414 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6832f12c-cb68-4e4c-9dde-084003a3d82d" containerName="console" Apr 20 20:10:00.959801 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:00.959422 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6abafad9-224c-4f34-9eea-67324b84e58c" containerName="console" Apr 20 20:10:00.962380 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:00.962365 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:00.970477 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:00.970449 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c4f755c7b-22lkx"] Apr 20 20:10:01.141892 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.141845 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-service-ca\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.142086 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.141912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-serving-cert\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.142086 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.141931 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-trusted-ca-bundle\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.142086 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.141948 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-oauth-config\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.142086 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.141965 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-oauth-serving-cert\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.142086 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.142065 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sssmc\" (UniqueName: \"kubernetes.io/projected/2966d7cd-1c3a-408e-bb89-146938f42ec7-kube-api-access-sssmc\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.142325 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.142145 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-config\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.243083 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.243004 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sssmc\" (UniqueName: 
\"kubernetes.io/projected/2966d7cd-1c3a-408e-bb89-146938f42ec7-kube-api-access-sssmc\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.243083 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.243039 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-config\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.243083 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.243058 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-service-ca\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.243347 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.243106 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-serving-cert\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.243347 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.243150 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-trusted-ca-bundle\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.243347 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.243173 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-oauth-config\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.243347 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.243203 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-oauth-serving-cert\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.243858 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.243825 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-config\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.243858 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.243851 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-oauth-serving-cert\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.244012 ip-10-0-143-23 kubenswrapper[2576]: 
I0420 20:10:01.243854 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-service-ca\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.244012 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.243996 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-trusted-ca-bundle\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.245548 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.245530 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-serving-cert\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.245687 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.245671 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-oauth-config\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.251818 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.251798 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sssmc\" (UniqueName: \"kubernetes.io/projected/2966d7cd-1c3a-408e-bb89-146938f42ec7-kube-api-access-sssmc\") pod \"console-7c4f755c7b-22lkx\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") " pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.271895 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.271870 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:01.396156 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.395937 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7c4f755c7b-22lkx"] Apr 20 20:10:01.398567 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:10:01.398539 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2966d7cd_1c3a_408e_bb89_146938f42ec7.slice/crio-806e46a70c9c4d59a5319d1020a6e99ce8bfd4db03a5feda2309f61f23a9a12c WatchSource:0}: Error finding container 806e46a70c9c4d59a5319d1020a6e99ce8bfd4db03a5feda2309f61f23a9a12c: Status 404 returned error can't find the container with id 806e46a70c9c4d59a5319d1020a6e99ce8bfd4db03a5feda2309f61f23a9a12c Apr 20 20:10:01.731766 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.731728 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c4f755c7b-22lkx" event={"ID":"2966d7cd-1c3a-408e-bb89-146938f42ec7","Type":"ContainerStarted","Data":"0cecb1bc40336e3c59a2334f0dbe39e04f2eb37554ade77aa583b571de3405b3"} Apr 20 20:10:01.731766 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.731763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c4f755c7b-22lkx" event={"ID":"2966d7cd-1c3a-408e-bb89-146938f42ec7","Type":"ContainerStarted","Data":"806e46a70c9c4d59a5319d1020a6e99ce8bfd4db03a5feda2309f61f23a9a12c"} Apr 20 20:10:01.748507 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:01.748451 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7c4f755c7b-22lkx" podStartSLOduration=1.7484348 podStartE2EDuration="1.7484348s" podCreationTimestamp="2026-04-20 20:10:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:10:01.747697689 +0000 UTC m=+276.460379679" watchObservedRunningTime="2026-04-20 20:10:01.7484348 +0000 UTC m=+276.461116797" Apr 20 20:10:11.272719 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:11.272683 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:11.272719 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:11.272726 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:11.277391 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:11.277371 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:11.761411 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:11.761380 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7c4f755c7b-22lkx" Apr 20 20:10:11.801025 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:11.800997 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c459df6d8-5f5rg"] Apr 20 20:10:25.797409 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:25.797383 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:10:25.797808 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:25.797386 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:10:25.800742 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:25.800722 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 20:10:36.819378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:36.819319 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c459df6d8-5f5rg" podUID="a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8" containerName="console" containerID="cri-o://9d156dc2fe9847b11886104b42c1bd9359593a9be63380a3f6e7a2813c891d75" gracePeriod=15 Apr 20 20:10:37.048924 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.048903 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c459df6d8-5f5rg_a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8/console/0.log" Apr 20 20:10:37.049035 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.048966 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:10:37.077480 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.077420 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-oauth-config\") pod \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " Apr 20 20:10:37.077480 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.077477 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-oauth-serving-cert\") pod \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " Apr 20 20:10:37.077603 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.077582 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-config\") pod \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " Apr 20 20:10:37.077643 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.077614 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-service-ca\") pod \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " Apr 20 20:10:37.077692 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.077646 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-serving-cert\") pod \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " Apr 20 20:10:37.077742 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.077727 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5n7n\" (UniqueName: \"kubernetes.io/projected/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-kube-api-access-k5n7n\") pod \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " Apr 20 20:10:37.077791 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.077761 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-trusted-ca-bundle\") pod \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\" (UID: \"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8\") " Apr 20 20:10:37.077960 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.077934 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8" (UID: "a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:10:37.077960 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.077943 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-config" (OuterVolumeSpecName: "console-config") pod "a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8" (UID: "a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:10:37.078073 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.077974 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-service-ca" (OuterVolumeSpecName: "service-ca") pod "a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8" (UID: "a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:10:37.078352 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.078320 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8" (UID: "a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:10:37.079711 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.079683 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8" (UID: "a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:10:37.079811 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.079759 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8" (UID: "a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:10:37.079999 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.079976 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-kube-api-access-k5n7n" (OuterVolumeSpecName: "kube-api-access-k5n7n") pod "a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8" (UID: "a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8"). InnerVolumeSpecName "kube-api-access-k5n7n". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:10:37.178461 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.178426 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-oauth-serving-cert\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:10:37.178461 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.178461 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:10:37.178601 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.178475 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-service-ca\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:10:37.178601 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.178487 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-serving-cert\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:10:37.178601 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.178498 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k5n7n\" (UniqueName: \"kubernetes.io/projected/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-kube-api-access-k5n7n\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:10:37.178601 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.178512 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-trusted-ca-bundle\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:10:37.178601 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.178524 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8-console-oauth-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:10:37.833197 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.833168 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c459df6d8-5f5rg_a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8/console/0.log" Apr 20 20:10:37.833618 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.833213 2576 generic.go:358] "Generic (PLEG): container finished" podID="a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8" containerID="9d156dc2fe9847b11886104b42c1bd9359593a9be63380a3f6e7a2813c891d75" exitCode=2 Apr 20 20:10:37.833618 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.833271 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c459df6d8-5f5rg" event={"ID":"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8","Type":"ContainerDied","Data":"9d156dc2fe9847b11886104b42c1bd9359593a9be63380a3f6e7a2813c891d75"} Apr 20 20:10:37.833618 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.833307 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c459df6d8-5f5rg" event={"ID":"a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8","Type":"ContainerDied","Data":"8974c5137a3946baab0c1ba1475c8d21585e79b8eb7997fb023e72797b296b76"} Apr 20 20:10:37.833618 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.833330 2576 scope.go:117] 
"RemoveContainer" containerID="9d156dc2fe9847b11886104b42c1bd9359593a9be63380a3f6e7a2813c891d75" Apr 20 20:10:37.833618 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.833275 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c459df6d8-5f5rg" Apr 20 20:10:37.842033 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.842017 2576 scope.go:117] "RemoveContainer" containerID="9d156dc2fe9847b11886104b42c1bd9359593a9be63380a3f6e7a2813c891d75" Apr 20 20:10:37.842299 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:10:37.842282 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d156dc2fe9847b11886104b42c1bd9359593a9be63380a3f6e7a2813c891d75\": container with ID starting with 9d156dc2fe9847b11886104b42c1bd9359593a9be63380a3f6e7a2813c891d75 not found: ID does not exist" containerID="9d156dc2fe9847b11886104b42c1bd9359593a9be63380a3f6e7a2813c891d75" Apr 20 20:10:37.842366 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.842310 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d156dc2fe9847b11886104b42c1bd9359593a9be63380a3f6e7a2813c891d75"} err="failed to get container status \"9d156dc2fe9847b11886104b42c1bd9359593a9be63380a3f6e7a2813c891d75\": rpc error: code = NotFound desc = could not find container \"9d156dc2fe9847b11886104b42c1bd9359593a9be63380a3f6e7a2813c891d75\": container with ID starting with 9d156dc2fe9847b11886104b42c1bd9359593a9be63380a3f6e7a2813c891d75 not found: ID does not exist" Apr 20 20:10:37.854654 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.854632 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c459df6d8-5f5rg"] Apr 20 20:10:37.858146 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.858126 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c459df6d8-5f5rg"] Apr 20 20:10:37.942889 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:10:37.942867 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8" path="/var/lib/kubelet/pods/a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8/volumes" Apr 20 20:11:36.674662 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.674629 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w"] Apr 20 20:11:36.675182 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.674985 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8" containerName="console" Apr 20 20:11:36.675182 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.675004 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8" containerName="console" Apr 20 20:11:36.675182 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.675145 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a6b38bc1-5e7c-4c5f-aded-bf5d99a8ace8" containerName="console" Apr 20 20:11:36.678035 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.678020 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" Apr 20 20:11:36.681668 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.681649 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-kktmq\"" Apr 20 20:11:36.682905 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.682885 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 20 20:11:36.683018 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.682915 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 20 20:11:36.707810 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.707787 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w"] Apr 20 20:11:36.787725 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.787699 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sz4t\" (UniqueName: \"kubernetes.io/projected/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-kube-api-access-4sz4t\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w\" (UID: \"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" Apr 20 20:11:36.787836 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.787739 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w\" (UID: \"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" Apr 20 20:11:36.787836 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.787760 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w\" (UID: \"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" Apr 20 20:11:36.888638 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.888600 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4sz4t\" (UniqueName: \"kubernetes.io/projected/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-kube-api-access-4sz4t\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w\" (UID: \"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" Apr 20 20:11:36.888789 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.888652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w\" (UID: \"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" Apr 20 20:11:36.888789 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.888680 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w\" (UID: \"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" Apr 20 20:11:36.889023 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.889000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w\" (UID: \"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" Apr 20 20:11:36.889098 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.889070 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w\" (UID: \"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" Apr 20 20:11:36.897946 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.897923 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sz4t\" (UniqueName: \"kubernetes.io/projected/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-kube-api-access-4sz4t\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w\" (UID: \"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" Apr 20 20:11:36.986232 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:36.986170 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" Apr 20 20:11:37.105722 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:37.105699 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w"] Apr 20 20:11:37.107503 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:11:37.107477 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8e3d9ae_7916_4a4a_a0da_8a1e5911de56.slice/crio-0ff021a788d0c1af4b85a808a36736e4f1559051b3d91f16cde886e76f2a2309 WatchSource:0}: Error finding container 0ff021a788d0c1af4b85a808a36736e4f1559051b3d91f16cde886e76f2a2309: Status 404 returned error can't find the container with id 0ff021a788d0c1af4b85a808a36736e4f1559051b3d91f16cde886e76f2a2309 Apr 20 20:11:37.109164 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:37.109146 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:11:37.992649 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:37.992607 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" event={"ID":"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56","Type":"ContainerStarted","Data":"0ff021a788d0c1af4b85a808a36736e4f1559051b3d91f16cde886e76f2a2309"} Apr 20 20:11:43.009621 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:43.009586 2576 generic.go:358] "Generic (PLEG): container finished" podID="b8e3d9ae-7916-4a4a-a0da-8a1e5911de56" containerID="cdded060c2e2429caeeafb017d4d07aec5558c01a70eadc929b08fbca0b1ea68" exitCode=0 Apr 20 20:11:43.010011 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:43.009674 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" event={"ID":"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56","Type":"ContainerDied","Data":"cdded060c2e2429caeeafb017d4d07aec5558c01a70eadc929b08fbca0b1ea68"} Apr 20 20:11:46.018965 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:46.018930 2576 generic.go:358] "Generic (PLEG): container finished" podID="b8e3d9ae-7916-4a4a-a0da-8a1e5911de56" containerID="a526a2ff18aec1afc6e3e7d572b88398550ffc40ccf6c06fcb5fdaf3ec3f1780" exitCode=0 Apr 20 20:11:46.018965 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:46.018965 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" event={"ID":"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56","Type":"ContainerDied","Data":"a526a2ff18aec1afc6e3e7d572b88398550ffc40ccf6c06fcb5fdaf3ec3f1780"} Apr 20 20:11:54.044088 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:54.044049 2576 generic.go:358] "Generic (PLEG): container finished" podID="b8e3d9ae-7916-4a4a-a0da-8a1e5911de56" containerID="5b01790fe4293560614eeef351cddf40763a5ab32779ce2ab7a6e817b3b2d8e4" exitCode=0 Apr 20 20:11:54.044509 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:54.044140 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" event={"ID":"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56","Type":"ContainerDied","Data":"5b01790fe4293560614eeef351cddf40763a5ab32779ce2ab7a6e817b3b2d8e4"} Apr 20 20:11:55.165782 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:55.165760 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" Apr 20 20:11:55.237781 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:55.237752 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-util\") pod \"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56\" (UID: \"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56\") " Apr 20 20:11:55.237781 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:55.237783 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-bundle\") pod \"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56\" (UID: \"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56\") " Apr 20 20:11:55.238018 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:55.237837 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sz4t\" (UniqueName: \"kubernetes.io/projected/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-kube-api-access-4sz4t\") pod \"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56\" (UID: \"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56\") " Apr 20 20:11:55.238395 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:55.238372 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-bundle" (OuterVolumeSpecName: "bundle") pod "b8e3d9ae-7916-4a4a-a0da-8a1e5911de56" (UID: "b8e3d9ae-7916-4a4a-a0da-8a1e5911de56"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:11:55.239944 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:55.239923 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-kube-api-access-4sz4t" (OuterVolumeSpecName: "kube-api-access-4sz4t") pod "b8e3d9ae-7916-4a4a-a0da-8a1e5911de56" (UID: "b8e3d9ae-7916-4a4a-a0da-8a1e5911de56"). InnerVolumeSpecName "kube-api-access-4sz4t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:11:55.242036 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:55.242011 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-util" (OuterVolumeSpecName: "util") pod "b8e3d9ae-7916-4a4a-a0da-8a1e5911de56" (UID: "b8e3d9ae-7916-4a4a-a0da-8a1e5911de56"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:11:55.338660 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:55.338580 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4sz4t\" (UniqueName: \"kubernetes.io/projected/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-kube-api-access-4sz4t\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:11:55.338660 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:55.338607 2576 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-util\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:11:55.338660 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:55.338620 2576 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b8e3d9ae-7916-4a4a-a0da-8a1e5911de56-bundle\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:11:56.051025 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:56.050994 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" event={"ID":"b8e3d9ae-7916-4a4a-a0da-8a1e5911de56","Type":"ContainerDied","Data":"0ff021a788d0c1af4b85a808a36736e4f1559051b3d91f16cde886e76f2a2309"} Apr 20 20:11:56.051025 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:56.051015 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29ctwm5w" Apr 20 20:11:56.051025 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:56.051030 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ff021a788d0c1af4b85a808a36736e4f1559051b3d91f16cde886e76f2a2309" Apr 20 20:11:58.330525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.330477 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j"] Apr 20 20:11:58.331011 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.330882 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8e3d9ae-7916-4a4a-a0da-8a1e5911de56" containerName="pull" Apr 20 20:11:58.331011 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.330899 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e3d9ae-7916-4a4a-a0da-8a1e5911de56" containerName="pull" Apr 20 20:11:58.331011 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.330916 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8e3d9ae-7916-4a4a-a0da-8a1e5911de56" containerName="util" Apr 20 20:11:58.331011 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.330924 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e3d9ae-7916-4a4a-a0da-8a1e5911de56" containerName="util" Apr 20 20:11:58.331011 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.330941 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b8e3d9ae-7916-4a4a-a0da-8a1e5911de56" containerName="extract" Apr 20 20:11:58.331011 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.330949 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8e3d9ae-7916-4a4a-a0da-8a1e5911de56" containerName="extract" Apr 20 20:11:58.331011 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.331005 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="b8e3d9ae-7916-4a4a-a0da-8a1e5911de56" containerName="extract" Apr 20 20:11:58.381500 
ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.381461 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j"] Apr 20 20:11:58.381680 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.381607 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j" Apr 20 20:11:58.384175 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.384153 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 20 20:11:58.384306 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.384292 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-lnscz\"" Apr 20 20:11:58.384400 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.384386 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 20 20:11:58.384480 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.384466 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 20 20:11:58.455044 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.455011 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/f792d2fc-9721-49ed-b027-57475ec2b38e-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j\" (UID: \"f792d2fc-9721-49ed-b027-57475ec2b38e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j" Apr 20 20:11:58.455202 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.455065 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gjdl\" (UniqueName: \"kubernetes.io/projected/f792d2fc-9721-49ed-b027-57475ec2b38e-kube-api-access-6gjdl\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j\" (UID: \"f792d2fc-9721-49ed-b027-57475ec2b38e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j" Apr 20 20:11:58.555411 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.555378 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/f792d2fc-9721-49ed-b027-57475ec2b38e-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j\" (UID: \"f792d2fc-9721-49ed-b027-57475ec2b38e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j" Apr 20 20:11:58.555580 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.555435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gjdl\" (UniqueName: \"kubernetes.io/projected/f792d2fc-9721-49ed-b027-57475ec2b38e-kube-api-access-6gjdl\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j\" (UID: \"f792d2fc-9721-49ed-b027-57475ec2b38e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j" Apr 20 20:11:58.557880 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.557859 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/f792d2fc-9721-49ed-b027-57475ec2b38e-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j\" (UID: \"f792d2fc-9721-49ed-b027-57475ec2b38e\") " 
pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j" Apr 20 20:11:58.566293 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.566263 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gjdl\" (UniqueName: \"kubernetes.io/projected/f792d2fc-9721-49ed-b027-57475ec2b38e-kube-api-access-6gjdl\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j\" (UID: \"f792d2fc-9721-49ed-b027-57475ec2b38e\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j" Apr 20 20:11:58.691033 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.690952 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j" Apr 20 20:11:58.812398 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:58.812297 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j"] Apr 20 20:11:58.814519 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:11:58.814486 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf792d2fc_9721_49ed_b027_57475ec2b38e.slice/crio-79b84ec145e3821913c9414d60f6633dff4fc7fb86976d68a375ed266ea6d8ae WatchSource:0}: Error finding container 79b84ec145e3821913c9414d60f6633dff4fc7fb86976d68a375ed266ea6d8ae: Status 404 returned error can't find the container with id 79b84ec145e3821913c9414d60f6633dff4fc7fb86976d68a375ed266ea6d8ae Apr 20 20:11:59.062353 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:11:59.062315 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j" event={"ID":"f792d2fc-9721-49ed-b027-57475ec2b38e","Type":"ContainerStarted","Data":"79b84ec145e3821913c9414d60f6633dff4fc7fb86976d68a375ed266ea6d8ae"} Apr 20 20:12:03.012315 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.012283 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-k7xlh"] Apr 20 20:12:03.015450 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.015432 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:03.018095 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.018071 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 20 20:12:03.018204 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.018074 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 20 20:12:03.018204 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.018169 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-w25nj\"" Apr 20 20:12:03.024680 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.024658 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-k7xlh"] Apr 20 20:12:03.079329 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.079295 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j" event={"ID":"f792d2fc-9721-49ed-b027-57475ec2b38e","Type":"ContainerStarted","Data":"441c8192c56070eaa1ca26278d41abd33bcce79006bea9d02cc8d2e3570cf7b0"} Apr 20 20:12:03.079467 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.079377 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j" Apr 20 20:12:03.090238 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.090213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5wgp\" (UniqueName: \"kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-kube-api-access-x5wgp\") pod \"keda-operator-ffbb595cb-k7xlh\" (UID: \"e5aad4df-a72a-463d-a0ec-2d7703eebc56\") " pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:03.090366 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.090348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e5aad4df-a72a-463d-a0ec-2d7703eebc56-cabundle0\") pod \"keda-operator-ffbb595cb-k7xlh\" (UID: \"e5aad4df-a72a-463d-a0ec-2d7703eebc56\") " pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:03.090425 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.090400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-certificates\") pod \"keda-operator-ffbb595cb-k7xlh\" (UID: \"e5aad4df-a72a-463d-a0ec-2d7703eebc56\") " pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:03.191513 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.191482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5wgp\" (UniqueName: \"kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-kube-api-access-x5wgp\") pod \"keda-operator-ffbb595cb-k7xlh\" (UID: \"e5aad4df-a72a-463d-a0ec-2d7703eebc56\") " pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:03.191642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.191519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e5aad4df-a72a-463d-a0ec-2d7703eebc56-cabundle0\") pod \"keda-operator-ffbb595cb-k7xlh\" (UID: \"e5aad4df-a72a-463d-a0ec-2d7703eebc56\") " 
pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:03.191642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.191544 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-certificates\") pod \"keda-operator-ffbb595cb-k7xlh\" (UID: \"e5aad4df-a72a-463d-a0ec-2d7703eebc56\") " pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:03.191642 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.191630 2576 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found Apr 20 20:12:03.191642 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.191644 2576 secret.go:281] references non-existent secret key: ca.crt Apr 20 20:12:03.191850 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.191650 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 20 20:12:03.191850 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.191662 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-k7xlh: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 20 20:12:03.191850 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.191706 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-certificates podName:e5aad4df-a72a-463d-a0ec-2d7703eebc56 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:03.691688352 +0000 UTC m=+398.404370328 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-certificates") pod "keda-operator-ffbb595cb-k7xlh" (UID: "e5aad4df-a72a-463d-a0ec-2d7703eebc56") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt] Apr 20 20:12:03.192248 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.192228 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/e5aad4df-a72a-463d-a0ec-2d7703eebc56-cabundle0\") pod \"keda-operator-ffbb595cb-k7xlh\" (UID: \"e5aad4df-a72a-463d-a0ec-2d7703eebc56\") " pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:03.199900 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.199881 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5wgp\" (UniqueName: \"kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-kube-api-access-x5wgp\") pod \"keda-operator-ffbb595cb-k7xlh\" (UID: \"e5aad4df-a72a-463d-a0ec-2d7703eebc56\") " pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:03.286364 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.286309 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j" podStartSLOduration=1.615748187 podStartE2EDuration="5.286296886s" podCreationTimestamp="2026-04-20 20:11:58 +0000 UTC" firstStartedPulling="2026-04-20 20:11:58.816242552 +0000 UTC m=+393.528924531" lastFinishedPulling="2026-04-20 20:12:02.486791243 +0000 UTC m=+397.199473230" observedRunningTime="2026-04-20 20:12:03.096758681 +0000 UTC m=+397.809440690" watchObservedRunningTime="2026-04-20 20:12:03.286296886 +0000 UTC m=+397.998978882" Apr 20 20:12:03.287140 ip-10-0-143-23 
kubenswrapper[2576]: I0420 20:12:03.287097 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd"] Apr 20 20:12:03.290386 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.290370 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:03.292995 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.292972 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 20 20:12:03.299409 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.299383 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd"] Apr 20 20:12:03.393747 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.393720 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52nf7\" (UniqueName: \"kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-kube-api-access-52nf7\") pod \"keda-metrics-apiserver-7c9f485588-wjjvd\" (UID: \"c708e289-89c8-45fa-b250-f564c5da8db3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:03.393747 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.393752 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-wjjvd\" (UID: \"c708e289-89c8-45fa-b250-f564c5da8db3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:03.393960 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.393769 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/c708e289-89c8-45fa-b250-f564c5da8db3-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-wjjvd\" (UID: \"c708e289-89c8-45fa-b250-f564c5da8db3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:03.494822 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.494792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-wjjvd\" (UID: \"c708e289-89c8-45fa-b250-f564c5da8db3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:03.495049 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.494834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/c708e289-89c8-45fa-b250-f564c5da8db3-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-wjjvd\" (UID: \"c708e289-89c8-45fa-b250-f564c5da8db3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:03.495049 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.494936 2576 secret.go:281] references non-existent secret key: tls.crt Apr 20 20:12:03.495049 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.494948 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52nf7\" (UniqueName: \"kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-kube-api-access-52nf7\") pod \"keda-metrics-apiserver-7c9f485588-wjjvd\" (UID: \"c708e289-89c8-45fa-b250-f564c5da8db3\") " 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:03.495049 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.494958 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 20 20:12:03.495049 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.494978 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd: references non-existent secret key: tls.crt Apr 20 20:12:03.495049 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.495030 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-certificates podName:c708e289-89c8-45fa-b250-f564c5da8db3 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:03.995014864 +0000 UTC m=+398.707696839 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-certificates") pod "keda-metrics-apiserver-7c9f485588-wjjvd" (UID: "c708e289-89c8-45fa-b250-f564c5da8db3") : references non-existent secret key: tls.crt Apr 20 20:12:03.495277 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.495247 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/c708e289-89c8-45fa-b250-f564c5da8db3-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-wjjvd\" (UID: \"c708e289-89c8-45fa-b250-f564c5da8db3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:03.509324 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.509295 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52nf7\" (UniqueName: \"kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-kube-api-access-52nf7\") pod \"keda-metrics-apiserver-7c9f485588-wjjvd\" (UID: \"c708e289-89c8-45fa-b250-f564c5da8db3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:03.514007 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.513986 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-q6g8s"] Apr 20 20:12:03.517058 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.517041 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-q6g8s" Apr 20 20:12:03.519477 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.519456 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 20 20:12:03.528814 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.528794 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-q6g8s"] Apr 20 20:12:03.595741 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.595675 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8-certificates\") pod \"keda-admission-cf49989db-q6g8s\" (UID: \"a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8\") " pod="openshift-keda/keda-admission-cf49989db-q6g8s" Apr 20 20:12:03.595741 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.595731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lgt8\" (UniqueName: \"kubernetes.io/projected/a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8-kube-api-access-7lgt8\") pod \"keda-admission-cf49989db-q6g8s\" (UID: \"a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8\") " pod="openshift-keda/keda-admission-cf49989db-q6g8s" Apr 20 20:12:03.697051 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.697015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-certificates\") pod \"keda-operator-ffbb595cb-k7xlh\" (UID: \"e5aad4df-a72a-463d-a0ec-2d7703eebc56\") " pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:03.697240 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.697062 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8-certificates\") pod \"keda-admission-cf49989db-q6g8s\" (UID: \"a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8\") " pod="openshift-keda/keda-admission-cf49989db-q6g8s" Apr 20 20:12:03.697240 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.697202 2576 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 20 20:12:03.697240 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.697224 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-q6g8s: secret "keda-admission-webhooks-certs" not found Apr 20 20:12:03.697383 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.697235 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lgt8\" (UniqueName: \"kubernetes.io/projected/a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8-kube-api-access-7lgt8\") pod \"keda-admission-cf49989db-q6g8s\" (UID: \"a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8\") " pod="openshift-keda/keda-admission-cf49989db-q6g8s" Apr 20 20:12:03.697383 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.697279 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8-certificates podName:a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:04.197262529 +0000 UTC m=+398.909944507 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8-certificates") pod "keda-admission-cf49989db-q6g8s" (UID: "a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8") : secret "keda-admission-webhooks-certs" not found Apr 20 20:12:03.697383 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.697203 2576 secret.go:281] references non-existent secret key: ca.crt Apr 20 20:12:03.697383 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.697308 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 20 20:12:03.697383 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.697322 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-k7xlh: references non-existent secret key: ca.crt Apr 20 20:12:03.697383 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:03.697369 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-certificates podName:e5aad4df-a72a-463d-a0ec-2d7703eebc56 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:04.697354697 +0000 UTC m=+399.410036677 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-certificates") pod "keda-operator-ffbb595cb-k7xlh" (UID: "e5aad4df-a72a-463d-a0ec-2d7703eebc56") : references non-existent secret key: ca.crt Apr 20 20:12:03.706328 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:03.706303 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lgt8\" (UniqueName: \"kubernetes.io/projected/a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8-kube-api-access-7lgt8\") pod \"keda-admission-cf49989db-q6g8s\" (UID: \"a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8\") " pod="openshift-keda/keda-admission-cf49989db-q6g8s" Apr 20 20:12:04.000092 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:04.000014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-wjjvd\" (UID: \"c708e289-89c8-45fa-b250-f564c5da8db3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:04.000245 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:04.000162 2576 secret.go:281] references non-existent secret key: tls.crt Apr 20 20:12:04.000245 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:04.000176 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 20 20:12:04.000245 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:04.000197 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd: references non-existent secret key: tls.crt Apr 20 20:12:04.000343 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:04.000252 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-certificates podName:c708e289-89c8-45fa-b250-f564c5da8db3 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:05.000238131 +0000 UTC m=+399.712920127 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-certificates") pod "keda-metrics-apiserver-7c9f485588-wjjvd" (UID: "c708e289-89c8-45fa-b250-f564c5da8db3") : references non-existent secret key: tls.crt Apr 20 20:12:04.201635 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:04.201603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8-certificates\") pod \"keda-admission-cf49989db-q6g8s\" (UID: \"a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8\") " pod="openshift-keda/keda-admission-cf49989db-q6g8s" Apr 20 20:12:04.201988 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:04.201717 2576 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 20 20:12:04.201988 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:04.201734 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-q6g8s: secret "keda-admission-webhooks-certs" not found Apr 20 20:12:04.201988 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:04.201778 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8-certificates podName:a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:05.201765418 +0000 UTC m=+399.914447394 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8-certificates") pod "keda-admission-cf49989db-q6g8s" (UID: "a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8") : secret "keda-admission-webhooks-certs" not found Apr 20 20:12:04.706684 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:04.706646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-certificates\") pod \"keda-operator-ffbb595cb-k7xlh\" (UID: \"e5aad4df-a72a-463d-a0ec-2d7703eebc56\") " pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:04.706861 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:04.706813 2576 secret.go:281] references non-existent secret key: ca.crt Apr 20 20:12:04.706861 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:04.706829 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 20 20:12:04.706861 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:04.706837 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-k7xlh: references non-existent secret key: ca.crt Apr 20 20:12:04.706957 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:04.706889 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-certificates podName:e5aad4df-a72a-463d-a0ec-2d7703eebc56 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:06.706872832 +0000 UTC m=+401.419554818 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-certificates") pod "keda-operator-ffbb595cb-k7xlh" (UID: "e5aad4df-a72a-463d-a0ec-2d7703eebc56") : references non-existent secret key: ca.crt Apr 20 20:12:05.008526 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:05.008445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-wjjvd\" (UID: \"c708e289-89c8-45fa-b250-f564c5da8db3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:05.008666 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:05.008567 2576 secret.go:281] references non-existent secret key: tls.crt Apr 20 20:12:05.008666 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:05.008582 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 20 20:12:05.008666 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:05.008603 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd: references non-existent secret key: tls.crt Apr 20 20:12:05.008666 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:05.008649 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-certificates podName:c708e289-89c8-45fa-b250-f564c5da8db3 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:07.008636122 +0000 UTC m=+401.721318097 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-certificates") pod "keda-metrics-apiserver-7c9f485588-wjjvd" (UID: "c708e289-89c8-45fa-b250-f564c5da8db3") : references non-existent secret key: tls.crt Apr 20 20:12:05.210503 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:05.210467 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8-certificates\") pod \"keda-admission-cf49989db-q6g8s\" (UID: \"a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8\") " pod="openshift-keda/keda-admission-cf49989db-q6g8s" Apr 20 20:12:05.212818 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:05.212795 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8-certificates\") pod \"keda-admission-cf49989db-q6g8s\" (UID: \"a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8\") " pod="openshift-keda/keda-admission-cf49989db-q6g8s" Apr 20 20:12:05.327608 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:05.327581 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-q6g8s" Apr 20 20:12:05.445165 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:05.445141 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-q6g8s"] Apr 20 20:12:05.448125 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:12:05.448075 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8183170_fb0c_47e8_b5fd_a7d3f1fb78d8.slice/crio-7ba9202b315e67a788e07e78dd153b11fcd55da3f19947672a958b9a994db4d1 WatchSource:0}: Error finding container 7ba9202b315e67a788e07e78dd153b11fcd55da3f19947672a958b9a994db4d1: Status 404 returned error can't find the container with id 7ba9202b315e67a788e07e78dd153b11fcd55da3f19947672a958b9a994db4d1 Apr 20 20:12:06.091830 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:06.091792 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-q6g8s" event={"ID":"a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8","Type":"ContainerStarted","Data":"7ba9202b315e67a788e07e78dd153b11fcd55da3f19947672a958b9a994db4d1"} Apr 20 20:12:06.723317 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:06.723284 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-certificates\") pod \"keda-operator-ffbb595cb-k7xlh\" (UID: \"e5aad4df-a72a-463d-a0ec-2d7703eebc56\") " pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:06.723612 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:06.723384 2576 secret.go:281] references non-existent secret key: ca.crt Apr 20 20:12:06.723612 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:06.723395 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 20 20:12:06.723612 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:06.723403 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-k7xlh: references non-existent secret key: ca.crt Apr 20 20:12:06.723612 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:06.723444 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-certificates podName:e5aad4df-a72a-463d-a0ec-2d7703eebc56 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:10.723430995 +0000 UTC m=+405.436112970 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-certificates") pod "keda-operator-ffbb595cb-k7xlh" (UID: "e5aad4df-a72a-463d-a0ec-2d7703eebc56") : references non-existent secret key: ca.crt Apr 20 20:12:07.026567 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:07.026490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-wjjvd\" (UID: \"c708e289-89c8-45fa-b250-f564c5da8db3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:07.026708 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:07.026597 2576 secret.go:281] references non-existent secret key: tls.crt Apr 20 20:12:07.026708 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:07.026608 2576 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 20 20:12:07.026708 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:07.026625 2576 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd: references non-existent secret key: tls.crt Apr 20 20:12:07.026708 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:12:07.026670 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-certificates podName:c708e289-89c8-45fa-b250-f564c5da8db3 nodeName:}" failed. No retries permitted until 2026-04-20 20:12:11.026657565 +0000 UTC m=+405.739339540 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-certificates") pod "keda-metrics-apiserver-7c9f485588-wjjvd" (UID: "c708e289-89c8-45fa-b250-f564c5da8db3") : references non-existent secret key: tls.crt Apr 20 20:12:07.096520 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:07.096489 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-q6g8s" event={"ID":"a8183170-fb0c-47e8-b5fd-a7d3f1fb78d8","Type":"ContainerStarted","Data":"f53c572beefeef60bfc0a38d95c7d5a37b08b8ed8799efa0617cbfa58d7269f5"} Apr 20 20:12:07.096677 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:07.096579 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-q6g8s" Apr 20 20:12:07.114386 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:07.114331 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-q6g8s" podStartSLOduration=2.936275443 podStartE2EDuration="4.114316573s" podCreationTimestamp="2026-04-20 20:12:03 +0000 UTC" firstStartedPulling="2026-04-20 20:12:05.449720647 +0000 UTC m=+400.162402621" lastFinishedPulling="2026-04-20 20:12:06.627761762 +0000 UTC m=+401.340443751" observedRunningTime="2026-04-20 20:12:07.11349947 +0000 UTC m=+401.826181457" watchObservedRunningTime="2026-04-20 20:12:07.114316573 +0000 UTC m=+401.826998569" Apr 20 20:12:10.753224 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:10.753177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-certificates\") pod \"keda-operator-ffbb595cb-k7xlh\" (UID: 
\"e5aad4df-a72a-463d-a0ec-2d7703eebc56\") " pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:10.755652 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:10.755628 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e5aad4df-a72a-463d-a0ec-2d7703eebc56-certificates\") pod \"keda-operator-ffbb595cb-k7xlh\" (UID: \"e5aad4df-a72a-463d-a0ec-2d7703eebc56\") " pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:10.826070 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:10.826041 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:10.941055 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:10.941029 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-k7xlh"] Apr 20 20:12:10.942977 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:12:10.942954 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5aad4df_a72a_463d_a0ec_2d7703eebc56.slice/crio-4507939b8d4f591668ec061273734257c90c3a83601f55a208db05cc8e82585b WatchSource:0}: Error finding container 4507939b8d4f591668ec061273734257c90c3a83601f55a208db05cc8e82585b: Status 404 returned error can't find the container with id 4507939b8d4f591668ec061273734257c90c3a83601f55a208db05cc8e82585b Apr 20 20:12:11.059017 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:11.058856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-wjjvd\" (UID: \"c708e289-89c8-45fa-b250-f564c5da8db3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:11.062095 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:11.062061 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/c708e289-89c8-45fa-b250-f564c5da8db3-certificates\") pod \"keda-metrics-apiserver-7c9f485588-wjjvd\" (UID: \"c708e289-89c8-45fa-b250-f564c5da8db3\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:11.101183 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:11.101161 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:11.111281 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:11.111256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" event={"ID":"e5aad4df-a72a-463d-a0ec-2d7703eebc56","Type":"ContainerStarted","Data":"4507939b8d4f591668ec061273734257c90c3a83601f55a208db05cc8e82585b"} Apr 20 20:12:11.216707 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:11.216677 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd"] Apr 20 20:12:11.219384 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:12:11.219359 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc708e289_89c8_45fa_b250_f564c5da8db3.slice/crio-87cf3201df5d5a3454b9c699e8784246dbaeadbfe689cec1fc47e1974b861c46 WatchSource:0}: Error finding container 87cf3201df5d5a3454b9c699e8784246dbaeadbfe689cec1fc47e1974b861c46: Status 404 returned error can't find the container with id 87cf3201df5d5a3454b9c699e8784246dbaeadbfe689cec1fc47e1974b861c46 Apr 20 20:12:12.116672 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:12.116626 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" event={"ID":"c708e289-89c8-45fa-b250-f564c5da8db3","Type":"ContainerStarted","Data":"87cf3201df5d5a3454b9c699e8784246dbaeadbfe689cec1fc47e1974b861c46"} Apr 20 20:12:15.129347 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:15.129252 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" event={"ID":"c708e289-89c8-45fa-b250-f564c5da8db3","Type":"ContainerStarted","Data":"0e43871683d5ff4158654026c7477a09c9f34c79b9339426833779a2a34f2dd1"} Apr 20 20:12:15.129347 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:15.129332 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:15.130815 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:15.130781 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" event={"ID":"e5aad4df-a72a-463d-a0ec-2d7703eebc56","Type":"ContainerStarted","Data":"b80f2997a4a834ac0fc26ade0aa3dd0c86863c1d56d5fbf9c2580dc1f157a805"} Apr 20 20:12:15.131211 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:15.130905 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:12:15.147496 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:15.147452 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" podStartSLOduration=8.497440389 podStartE2EDuration="12.147440996s" podCreationTimestamp="2026-04-20 20:12:03 +0000 UTC" firstStartedPulling="2026-04-20 20:12:11.220607938 +0000 UTC m=+405.933289914" lastFinishedPulling="2026-04-20 20:12:14.870608532 +0000 UTC m=+409.583290521" observedRunningTime="2026-04-20 20:12:15.145946418 +0000 UTC m=+409.858628417" watchObservedRunningTime="2026-04-20 20:12:15.147440996 +0000 UTC m=+409.860122993" Apr 20 20:12:15.162520 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:15.162478 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" podStartSLOduration=9.236217006 
podStartE2EDuration="13.162467109s" podCreationTimestamp="2026-04-20 20:12:02 +0000 UTC" firstStartedPulling="2026-04-20 20:12:10.944149937 +0000 UTC m=+405.656831915" lastFinishedPulling="2026-04-20 20:12:14.870400039 +0000 UTC m=+409.583082018" observedRunningTime="2026-04-20 20:12:15.161043906 +0000 UTC m=+409.873725904" watchObservedRunningTime="2026-04-20 20:12:15.162467109 +0000 UTC m=+409.875149106" Apr 20 20:12:24.087790 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:24.087762 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-qxc8j" Apr 20 20:12:26.138106 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:26.138079 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-wjjvd" Apr 20 20:12:28.101741 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:28.101712 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-q6g8s" Apr 20 20:12:36.136139 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:12:36.136049 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-k7xlh" Apr 20 20:13:10.014710 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:10.014679 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-rlh6d"] Apr 20 20:13:10.017979 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:10.017959 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-rlh6d" Apr 20 20:13:10.020626 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:10.020601 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 20 20:13:10.020873 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:10.020856 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-pxbpx\"" Apr 20 20:13:10.021849 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:10.021829 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 20 20:13:10.021977 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:10.021837 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 20 20:13:10.029355 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:10.029332 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-rlh6d"] Apr 20 20:13:10.115913 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:10.115869 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtfpt\" (UniqueName: \"kubernetes.io/projected/351f1e3a-984d-49b3-96b5-b629f861d4d2-kube-api-access-dtfpt\") pod \"seaweedfs-86cc847c5c-rlh6d\" (UID: \"351f1e3a-984d-49b3-96b5-b629f861d4d2\") " pod="kserve/seaweedfs-86cc847c5c-rlh6d" Apr 20 20:13:10.116089 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:10.116012 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/351f1e3a-984d-49b3-96b5-b629f861d4d2-data\") pod \"seaweedfs-86cc847c5c-rlh6d\" (UID: \"351f1e3a-984d-49b3-96b5-b629f861d4d2\") " pod="kserve/seaweedfs-86cc847c5c-rlh6d" Apr 20 20:13:10.217041 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:10.217003 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dtfpt\" (UniqueName: \"kubernetes.io/projected/351f1e3a-984d-49b3-96b5-b629f861d4d2-kube-api-access-dtfpt\") pod \"seaweedfs-86cc847c5c-rlh6d\" (UID: \"351f1e3a-984d-49b3-96b5-b629f861d4d2\") " pod="kserve/seaweedfs-86cc847c5c-rlh6d" Apr 20 20:13:10.217220 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:10.217089 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/351f1e3a-984d-49b3-96b5-b629f861d4d2-data\") pod \"seaweedfs-86cc847c5c-rlh6d\" (UID: \"351f1e3a-984d-49b3-96b5-b629f861d4d2\") " pod="kserve/seaweedfs-86cc847c5c-rlh6d" Apr 20 20:13:10.217426 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:10.217410 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/351f1e3a-984d-49b3-96b5-b629f861d4d2-data\") pod \"seaweedfs-86cc847c5c-rlh6d\" (UID: \"351f1e3a-984d-49b3-96b5-b629f861d4d2\") " pod="kserve/seaweedfs-86cc847c5c-rlh6d" Apr 20 20:13:10.226595 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:10.226569 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtfpt\" (UniqueName: \"kubernetes.io/projected/351f1e3a-984d-49b3-96b5-b629f861d4d2-kube-api-access-dtfpt\") pod \"seaweedfs-86cc847c5c-rlh6d\" (UID: \"351f1e3a-984d-49b3-96b5-b629f861d4d2\") " pod="kserve/seaweedfs-86cc847c5c-rlh6d" Apr 20 20:13:10.328495 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:10.328471 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-rlh6d" Apr 20 20:13:10.456231 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:10.456202 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-rlh6d"] Apr 20 20:13:10.463075 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:13:10.463046 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod351f1e3a_984d_49b3_96b5_b629f861d4d2.slice/crio-1c7f9e42d3a3f0db31320e4d0b7507021bd8c20f0706e18b462d5384b23a9c40 WatchSource:0}: Error finding container 1c7f9e42d3a3f0db31320e4d0b7507021bd8c20f0706e18b462d5384b23a9c40: Status 404 returned error can't find the container with id 1c7f9e42d3a3f0db31320e4d0b7507021bd8c20f0706e18b462d5384b23a9c40 Apr 20 20:13:11.308550 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:11.308510 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-rlh6d" event={"ID":"351f1e3a-984d-49b3-96b5-b629f861d4d2","Type":"ContainerStarted","Data":"1c7f9e42d3a3f0db31320e4d0b7507021bd8c20f0706e18b462d5384b23a9c40"} Apr 20 20:13:13.317346 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:13.317305 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-rlh6d" event={"ID":"351f1e3a-984d-49b3-96b5-b629f861d4d2","Type":"ContainerStarted","Data":"18e9f12bff9d6c62f98848b3ff5615d41b6753a91009015a1a33b90d43f1ce31"} Apr 20 20:13:13.317701 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:13.317436 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-rlh6d" Apr 20 20:13:13.333176 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:13.333126 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-rlh6d" podStartSLOduration=1.790015204 podStartE2EDuration="4.333095937s" podCreationTimestamp="2026-04-20 20:13:09 +0000 
UTC" firstStartedPulling="2026-04-20 20:13:10.46425385 +0000 UTC m=+465.176935826" lastFinishedPulling="2026-04-20 20:13:13.007334576 +0000 UTC m=+467.720016559" observedRunningTime="2026-04-20 20:13:13.332060688 +0000 UTC m=+468.044742685" watchObservedRunningTime="2026-04-20 20:13:13.333095937 +0000 UTC m=+468.045777943" Apr 20 20:13:19.322885 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:19.322848 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-rlh6d" Apr 20 20:13:56.541017 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.540985 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-84bf55c695-hszhv"] Apr 20 20:13:56.543499 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.543481 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.557138 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.557100 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84bf55c695-hszhv"] Apr 20 20:13:56.649630 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.649592 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hzl2\" (UniqueName: \"kubernetes.io/projected/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-kube-api-access-2hzl2\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.649791 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.649632 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-console-serving-cert\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.649791 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.649685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-console-oauth-config\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.649791 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.649718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-service-ca\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.649791 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.649742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-trusted-ca-bundle\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.649791 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.649782 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-console-config\") pod 
\"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.649963 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.649799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-oauth-serving-cert\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.750506 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.750470 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hzl2\" (UniqueName: \"kubernetes.io/projected/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-kube-api-access-2hzl2\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.750506 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.750509 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-console-serving-cert\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.750710 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.750530 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-console-oauth-config\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.750710 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.750550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-service-ca\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.750710 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.750579 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-trusted-ca-bundle\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.750710 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.750628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-console-config\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.750710 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.750653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-oauth-serving-cert\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.751408 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.751382 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-oauth-serving-cert\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.751537 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.751495 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-service-ca\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.751537 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.751496 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-console-config\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.751660 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.751588 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-trusted-ca-bundle\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.753034 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.753006 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-console-oauth-config\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.753192 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.753172 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-console-serving-cert\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.758995 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.758976 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hzl2\" (UniqueName: \"kubernetes.io/projected/ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5-kube-api-access-2hzl2\") pod \"console-84bf55c695-hszhv\" (UID: \"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5\") " pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.852227 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.852155 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:13:56.977027 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:56.977001 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-84bf55c695-hszhv"] Apr 20 20:13:56.979093 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:13:56.979062 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecf08bb4_ea03_45a3_acdb_d8890ef6b9b5.slice/crio-5cb0d9fb169f6df57579a43e58e339d9dca937d56439cef9ec505a6fd672d812 WatchSource:0}: Error finding container 5cb0d9fb169f6df57579a43e58e339d9dca937d56439cef9ec505a6fd672d812: Status 404 returned error can't find the container with id 5cb0d9fb169f6df57579a43e58e339d9dca937d56439cef9ec505a6fd672d812 Apr 20 20:13:57.455648 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:57.455615 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84bf55c695-hszhv" event={"ID":"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5","Type":"ContainerStarted","Data":"5234de47c16e4eb3926581940cd47eb685c619a6f5836060456475b1a769f6d2"} Apr 20 20:13:57.455648 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:57.455650 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-84bf55c695-hszhv" event={"ID":"ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5","Type":"ContainerStarted","Data":"5cb0d9fb169f6df57579a43e58e339d9dca937d56439cef9ec505a6fd672d812"} Apr 20 20:13:57.474955 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:13:57.474911 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-84bf55c695-hszhv" podStartSLOduration=1.474898154 podStartE2EDuration="1.474898154s" podCreationTimestamp="2026-04-20 20:13:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:13:57.47342813 +0000 UTC m=+512.186110139" watchObservedRunningTime="2026-04-20 20:13:57.474898154 +0000 UTC m=+512.187580151" Apr 20 20:14:06.852966 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:06.852863 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:14:06.853459 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:06.853024 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:14:06.857769 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:06.857748 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:14:07.492423 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:07.492395 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-84bf55c695-hszhv" Apr 20 20:14:07.545051 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:07.545018 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c4f755c7b-22lkx"] Apr 20 20:14:20.846047 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:20.845998 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-dr5bg"] Apr 20 20:14:20.849489 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:20.849465 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-dr5bg" Apr 20 20:14:20.852090 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:20.852069 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 20 20:14:20.852216 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:20.852157 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-dnk6q\"" Apr 20 20:14:20.861230 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:20.861207 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-dr5bg"] Apr 20 20:14:20.863001 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:20.862982 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-vm6h5"] Apr 20 20:14:20.866235 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:20.866214 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-vm6h5" Apr 20 20:14:20.868774 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:20.868755 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-6sv45\"" Apr 20 20:14:20.868878 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:20.868842 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 20 20:14:20.876371 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:20.876343 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-vm6h5"] Apr 20 20:14:20.944329 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:20.944297 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlhm2\" (UniqueName: \"kubernetes.io/projected/c0890812-2ed0-426a-b98f-17b91b07c72b-kube-api-access-nlhm2\") pod \"model-serving-api-86f7b4b499-dr5bg\" (UID: \"c0890812-2ed0-426a-b98f-17b91b07c72b\") " pod="kserve/model-serving-api-86f7b4b499-dr5bg" Apr 20 20:14:20.944506 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:20.944365 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c0890812-2ed0-426a-b98f-17b91b07c72b-tls-certs\") pod \"model-serving-api-86f7b4b499-dr5bg\" (UID: \"c0890812-2ed0-426a-b98f-17b91b07c72b\") " pod="kserve/model-serving-api-86f7b4b499-dr5bg" Apr 20 20:14:21.045655 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.045616 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8d16292-03dc-4ace-a410-d17256e8596f-cert\") pod \"odh-model-controller-696fc77849-vm6h5\" (UID: \"e8d16292-03dc-4ace-a410-d17256e8596f\") " pod="kserve/odh-model-controller-696fc77849-vm6h5" Apr 20 20:14:21.045852 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.045671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c0890812-2ed0-426a-b98f-17b91b07c72b-tls-certs\") pod \"model-serving-api-86f7b4b499-dr5bg\" (UID: \"c0890812-2ed0-426a-b98f-17b91b07c72b\") " pod="kserve/model-serving-api-86f7b4b499-dr5bg" Apr 20 20:14:21.045852 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.045771 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dngm4\" (UniqueName: \"kubernetes.io/projected/e8d16292-03dc-4ace-a410-d17256e8596f-kube-api-access-dngm4\") pod \"odh-model-controller-696fc77849-vm6h5\" (UID: \"e8d16292-03dc-4ace-a410-d17256e8596f\") " pod="kserve/odh-model-controller-696fc77849-vm6h5" Apr 20 20:14:21.045852 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:14:21.045794 2576 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 20 20:14:21.045852 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.045804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlhm2\" (UniqueName: \"kubernetes.io/projected/c0890812-2ed0-426a-b98f-17b91b07c72b-kube-api-access-nlhm2\") pod \"model-serving-api-86f7b4b499-dr5bg\" (UID: \"c0890812-2ed0-426a-b98f-17b91b07c72b\") " pod="kserve/model-serving-api-86f7b4b499-dr5bg" Apr 20 20:14:21.046033 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:14:21.045863 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0890812-2ed0-426a-b98f-17b91b07c72b-tls-certs podName:c0890812-2ed0-426a-b98f-17b91b07c72b nodeName:}" failed. No retries permitted until 2026-04-20 20:14:21.545841774 +0000 UTC m=+536.258523748 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/c0890812-2ed0-426a-b98f-17b91b07c72b-tls-certs") pod "model-serving-api-86f7b4b499-dr5bg" (UID: "c0890812-2ed0-426a-b98f-17b91b07c72b") : secret "model-serving-api-tls" not found Apr 20 20:14:21.056825 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.056798 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlhm2\" (UniqueName: \"kubernetes.io/projected/c0890812-2ed0-426a-b98f-17b91b07c72b-kube-api-access-nlhm2\") pod \"model-serving-api-86f7b4b499-dr5bg\" (UID: \"c0890812-2ed0-426a-b98f-17b91b07c72b\") " pod="kserve/model-serving-api-86f7b4b499-dr5bg" Apr 20 20:14:21.146699 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.146612 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dngm4\" (UniqueName: \"kubernetes.io/projected/e8d16292-03dc-4ace-a410-d17256e8596f-kube-api-access-dngm4\") pod \"odh-model-controller-696fc77849-vm6h5\" (UID: \"e8d16292-03dc-4ace-a410-d17256e8596f\") " pod="kserve/odh-model-controller-696fc77849-vm6h5" Apr 20 20:14:21.146699 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.146683 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8d16292-03dc-4ace-a410-d17256e8596f-cert\") pod \"odh-model-controller-696fc77849-vm6h5\" (UID: \"e8d16292-03dc-4ace-a410-d17256e8596f\") " pod="kserve/odh-model-controller-696fc77849-vm6h5" Apr 20 20:14:21.149162 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.149129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e8d16292-03dc-4ace-a410-d17256e8596f-cert\") pod \"odh-model-controller-696fc77849-vm6h5\" (UID: \"e8d16292-03dc-4ace-a410-d17256e8596f\") " pod="kserve/odh-model-controller-696fc77849-vm6h5" Apr 20 20:14:21.154797 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.154770 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dngm4\" (UniqueName: \"kubernetes.io/projected/e8d16292-03dc-4ace-a410-d17256e8596f-kube-api-access-dngm4\") pod \"odh-model-controller-696fc77849-vm6h5\" (UID: 
\"e8d16292-03dc-4ace-a410-d17256e8596f\") " pod="kserve/odh-model-controller-696fc77849-vm6h5" Apr 20 20:14:21.178040 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.178003 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-vm6h5" Apr 20 20:14:21.298727 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.298612 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-vm6h5"] Apr 20 20:14:21.301382 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:14:21.301353 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8d16292_03dc_4ace_a410_d17256e8596f.slice/crio-0a4ecd939a7a4fa8162d0f15711f2eb425352a74830bd2c83a69ee864bf712c1 WatchSource:0}: Error finding container 0a4ecd939a7a4fa8162d0f15711f2eb425352a74830bd2c83a69ee864bf712c1: Status 404 returned error can't find the container with id 0a4ecd939a7a4fa8162d0f15711f2eb425352a74830bd2c83a69ee864bf712c1 Apr 20 20:14:21.535529 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.535497 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-vm6h5" event={"ID":"e8d16292-03dc-4ace-a410-d17256e8596f","Type":"ContainerStarted","Data":"0a4ecd939a7a4fa8162d0f15711f2eb425352a74830bd2c83a69ee864bf712c1"} Apr 20 20:14:21.550609 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.550586 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c0890812-2ed0-426a-b98f-17b91b07c72b-tls-certs\") pod \"model-serving-api-86f7b4b499-dr5bg\" (UID: \"c0890812-2ed0-426a-b98f-17b91b07c72b\") " pod="kserve/model-serving-api-86f7b4b499-dr5bg" Apr 20 20:14:21.552892 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.552875 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c0890812-2ed0-426a-b98f-17b91b07c72b-tls-certs\") pod \"model-serving-api-86f7b4b499-dr5bg\" (UID: \"c0890812-2ed0-426a-b98f-17b91b07c72b\") " pod="kserve/model-serving-api-86f7b4b499-dr5bg" Apr 20 20:14:21.761770 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.761738 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-dr5bg" Apr 20 20:14:21.892027 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:21.891925 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-dr5bg"] Apr 20 20:14:21.894467 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:14:21.894432 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0890812_2ed0_426a_b98f_17b91b07c72b.slice/crio-499a419604ee5144628d7a4478effbbc3078aa2f58f8ffe1a67ed6584ff276f9 WatchSource:0}: Error finding container 499a419604ee5144628d7a4478effbbc3078aa2f58f8ffe1a67ed6584ff276f9: Status 404 returned error can't find the container with id 499a419604ee5144628d7a4478effbbc3078aa2f58f8ffe1a67ed6584ff276f9 Apr 20 20:14:22.540508 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:22.540464 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-dr5bg" event={"ID":"c0890812-2ed0-426a-b98f-17b91b07c72b","Type":"ContainerStarted","Data":"499a419604ee5144628d7a4478effbbc3078aa2f58f8ffe1a67ed6584ff276f9"} Apr 20 20:14:25.552533 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:25.552497 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-vm6h5" event={"ID":"e8d16292-03dc-4ace-a410-d17256e8596f","Type":"ContainerStarted","Data":"cd6176fe9c2316307e28371a124aa45675a4e91b350355d18d90fc7bbdf76b10"} Apr 20 20:14:25.552985 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:25.552582 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-vm6h5" Apr 20 20:14:25.553974 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:25.553943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-dr5bg" event={"ID":"c0890812-2ed0-426a-b98f-17b91b07c72b","Type":"ContainerStarted","Data":"9517e29b29ec682568848434b267bb020a86853b304e960c495d8356bd46ee02"} Apr 20 20:14:25.554104 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:25.554073 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-dr5bg" Apr 20 20:14:25.571879 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:25.571834 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-vm6h5" podStartSLOduration=1.745937535 podStartE2EDuration="5.571821763s" podCreationTimestamp="2026-04-20 20:14:20 +0000 UTC" firstStartedPulling="2026-04-20 20:14:21.302601818 +0000 UTC m=+536.015283792" lastFinishedPulling="2026-04-20 20:14:25.128486032 +0000 UTC m=+539.841168020" observedRunningTime="2026-04-20 20:14:25.570396178 +0000 UTC m=+540.283078175" watchObservedRunningTime="2026-04-20 20:14:25.571821763 +0000 UTC m=+540.284503822" Apr 20 20:14:25.588438 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:25.588399 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-dr5bg" podStartSLOduration=2.304371587 podStartE2EDuration="5.588388699s" podCreationTimestamp="2026-04-20 20:14:20 +0000 UTC" firstStartedPulling="2026-04-20 20:14:21.896949454 +0000 UTC m=+536.609631442" lastFinishedPulling="2026-04-20 20:14:25.180966578 +0000 UTC m=+539.893648554" observedRunningTime="2026-04-20 20:14:25.586596688 +0000 UTC m=+540.299278685" watchObservedRunningTime="2026-04-20 20:14:25.588388699 +0000 UTC m=+540.301070734" Apr 20 20:14:32.563981 
Apr 20 20:14:32.563981 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.563907 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7c4f755c7b-22lkx" podUID="2966d7cd-1c3a-408e-bb89-146938f42ec7" containerName="console" containerID="cri-o://0cecb1bc40336e3c59a2334f0dbe39e04f2eb37554ade77aa583b571de3405b3" gracePeriod=15
Apr 20 20:14:32.807342 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.807321 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c4f755c7b-22lkx_2966d7cd-1c3a-408e-bb89-146938f42ec7/console/0.log"
Apr 20 20:14:32.807459 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.807380 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c4f755c7b-22lkx"
Apr 20 20:14:32.833638 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.833579 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-oauth-serving-cert\") pod \"2966d7cd-1c3a-408e-bb89-146938f42ec7\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") "
Apr 20 20:14:32.833638 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.833615 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-service-ca\") pod \"2966d7cd-1c3a-408e-bb89-146938f42ec7\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") "
Apr 20 20:14:32.833808 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.833653 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-oauth-config\") pod \"2966d7cd-1c3a-408e-bb89-146938f42ec7\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") "
Apr 20 20:14:32.833808 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.833764 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-trusted-ca-bundle\") pod \"2966d7cd-1c3a-408e-bb89-146938f42ec7\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") "
Apr 20 20:14:32.833918 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.833827 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sssmc\" (UniqueName: \"kubernetes.io/projected/2966d7cd-1c3a-408e-bb89-146938f42ec7-kube-api-access-sssmc\") pod \"2966d7cd-1c3a-408e-bb89-146938f42ec7\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") "
Apr 20 20:14:32.833918 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.833861 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-serving-cert\") pod \"2966d7cd-1c3a-408e-bb89-146938f42ec7\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") "
Apr 20 20:14:32.834020 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.833914 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-config\") pod \"2966d7cd-1c3a-408e-bb89-146938f42ec7\" (UID: \"2966d7cd-1c3a-408e-bb89-146938f42ec7\") "
Apr 20 20:14:32.834094 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.834057 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-service-ca" (OuterVolumeSpecName: "service-ca") pod "2966d7cd-1c3a-408e-bb89-146938f42ec7" (UID: "2966d7cd-1c3a-408e-bb89-146938f42ec7"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:14:32.834292 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.834065 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "2966d7cd-1c3a-408e-bb89-146938f42ec7" (UID: "2966d7cd-1c3a-408e-bb89-146938f42ec7"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:14:32.834292 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.834210 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "2966d7cd-1c3a-408e-bb89-146938f42ec7" (UID: "2966d7cd-1c3a-408e-bb89-146938f42ec7"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:14:32.834292 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.834236 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-oauth-serving-cert\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:14:32.834292 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.834257 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-service-ca\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:14:32.834508 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.834419 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-config" (OuterVolumeSpecName: "console-config") pod "2966d7cd-1c3a-408e-bb89-146938f42ec7" (UID: "2966d7cd-1c3a-408e-bb89-146938f42ec7"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:14:32.836144 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.836100 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2966d7cd-1c3a-408e-bb89-146938f42ec7-kube-api-access-sssmc" (OuterVolumeSpecName: "kube-api-access-sssmc") pod "2966d7cd-1c3a-408e-bb89-146938f42ec7" (UID: "2966d7cd-1c3a-408e-bb89-146938f42ec7"). InnerVolumeSpecName "kube-api-access-sssmc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:14:32.836447 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.836421 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "2966d7cd-1c3a-408e-bb89-146938f42ec7" (UID: "2966d7cd-1c3a-408e-bb89-146938f42ec7"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:14:32.836529 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.836433 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "2966d7cd-1c3a-408e-bb89-146938f42ec7" (UID: "2966d7cd-1c3a-408e-bb89-146938f42ec7"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:14:32.934987 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.934958 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-trusted-ca-bundle\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:14:32.934987 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.934984 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sssmc\" (UniqueName: \"kubernetes.io/projected/2966d7cd-1c3a-408e-bb89-146938f42ec7-kube-api-access-sssmc\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:14:32.934987 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.934994 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-serving-cert\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:14:32.935245 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.935004 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:14:32.935245 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:32.935012 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2966d7cd-1c3a-408e-bb89-146938f42ec7-console-oauth-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:14:33.581683 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:33.581657 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7c4f755c7b-22lkx_2966d7cd-1c3a-408e-bb89-146938f42ec7/console/0.log"
Apr 20 20:14:33.582087 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:33.581696 2576 generic.go:358] "Generic (PLEG): container finished" podID="2966d7cd-1c3a-408e-bb89-146938f42ec7" containerID="0cecb1bc40336e3c59a2334f0dbe39e04f2eb37554ade77aa583b571de3405b3" exitCode=2
Apr 20 20:14:33.582087 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:33.581768 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7c4f755c7b-22lkx"
Apr 20 20:14:33.582087 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:33.581786 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c4f755c7b-22lkx" event={"ID":"2966d7cd-1c3a-408e-bb89-146938f42ec7","Type":"ContainerDied","Data":"0cecb1bc40336e3c59a2334f0dbe39e04f2eb37554ade77aa583b571de3405b3"}
Apr 20 20:14:33.582087 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:33.581829 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7c4f755c7b-22lkx" event={"ID":"2966d7cd-1c3a-408e-bb89-146938f42ec7","Type":"ContainerDied","Data":"806e46a70c9c4d59a5319d1020a6e99ce8bfd4db03a5feda2309f61f23a9a12c"}
Apr 20 20:14:33.582087 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:33.581849 2576 scope.go:117] "RemoveContainer" containerID="0cecb1bc40336e3c59a2334f0dbe39e04f2eb37554ade77aa583b571de3405b3"
Apr 20 20:14:33.590829 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:33.590812 2576 scope.go:117] "RemoveContainer" containerID="0cecb1bc40336e3c59a2334f0dbe39e04f2eb37554ade77aa583b571de3405b3"
Apr 20 20:14:33.591072 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:14:33.591055 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cecb1bc40336e3c59a2334f0dbe39e04f2eb37554ade77aa583b571de3405b3\": container with ID starting with 0cecb1bc40336e3c59a2334f0dbe39e04f2eb37554ade77aa583b571de3405b3 not found: ID does not exist" containerID="0cecb1bc40336e3c59a2334f0dbe39e04f2eb37554ade77aa583b571de3405b3"
Apr 20 20:14:33.591149 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:33.591082 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cecb1bc40336e3c59a2334f0dbe39e04f2eb37554ade77aa583b571de3405b3"} err="failed to get container status \"0cecb1bc40336e3c59a2334f0dbe39e04f2eb37554ade77aa583b571de3405b3\": rpc error: code = NotFound desc = could not find container \"0cecb1bc40336e3c59a2334f0dbe39e04f2eb37554ade77aa583b571de3405b3\": container with ID starting with 0cecb1bc40336e3c59a2334f0dbe39e04f2eb37554ade77aa583b571de3405b3 not found: ID does not exist"
Apr 20 20:14:33.603590 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:33.603559 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7c4f755c7b-22lkx"]
Apr 20 20:14:33.607368 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:33.607330 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7c4f755c7b-22lkx"]
Apr 20 20:14:33.945067 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:33.944990 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2966d7cd-1c3a-408e-bb89-146938f42ec7" path="/var/lib/kubelet/pods/2966d7cd-1c3a-408e-bb89-146938f42ec7/volumes"
Apr 20 20:14:36.559306 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:36.559275 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-vm6h5"
Apr 20 20:14:36.561355 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:36.561326 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-dr5bg"
Apr 20 20:14:37.417087 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:37.417055 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-6gm2r"]
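
The error pair in this teardown ("ContainerStatus from runtime service failed" / "DeleteContainer returned error", both NotFound) appears to be a benign race rather than a failure: RemoveContainer is issued twice in quick succession, and by the second call the runtime has already removed the container along with its sandbox, so the status lookup finds nothing. When scanning these logs it helps to separate that pattern from genuine deletion failures; a small triage heuristic of my own (plain string matching on the logged text, nothing more):

    package main

    import (
        "fmt"
        "strings"
    )

    // benignDeleteError reports whether a "DeleteContainer returned error"
    // line failed only because the container was already gone (NotFound),
    // which, as above, usually means the sandbox cleanup got there first.
    func benignDeleteError(line string) bool {
        return strings.Contains(line, `"DeleteContainer returned error"`) &&
            strings.Contains(line, "code = NotFound")
    }

    func main() {
        line := `I0420 20:14:33.591082 2576 pod_container_deletor.go:53] "DeleteContainer returned error" err="...: rpc error: code = NotFound ..."`
        fmt.Println(benignDeleteError(line)) // true
    }
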
"RemoveStaleState: containerMap: removing container" podUID="2966d7cd-1c3a-408e-bb89-146938f42ec7" containerName="console" Apr 20 20:14:37.417463 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:37.417417 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2966d7cd-1c3a-408e-bb89-146938f42ec7" containerName="console" Apr 20 20:14:37.417508 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:37.417487 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2966d7cd-1c3a-408e-bb89-146938f42ec7" containerName="console" Apr 20 20:14:37.421948 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:37.421931 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-6gm2r" Apr 20 20:14:37.427214 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:37.427191 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-6gm2r"] Apr 20 20:14:37.467175 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:37.467142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2rbl\" (UniqueName: \"kubernetes.io/projected/5f256a79-a998-41f8-be60-f43eed0bbc4d-kube-api-access-g2rbl\") pod \"s3-init-6gm2r\" (UID: \"5f256a79-a998-41f8-be60-f43eed0bbc4d\") " pod="kserve/s3-init-6gm2r" Apr 20 20:14:37.568193 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:37.568159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2rbl\" (UniqueName: \"kubernetes.io/projected/5f256a79-a998-41f8-be60-f43eed0bbc4d-kube-api-access-g2rbl\") pod \"s3-init-6gm2r\" (UID: \"5f256a79-a998-41f8-be60-f43eed0bbc4d\") " pod="kserve/s3-init-6gm2r" Apr 20 20:14:37.577844 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:37.577819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2rbl\" (UniqueName: \"kubernetes.io/projected/5f256a79-a998-41f8-be60-f43eed0bbc4d-kube-api-access-g2rbl\") pod \"s3-init-6gm2r\" (UID: \"5f256a79-a998-41f8-be60-f43eed0bbc4d\") " pod="kserve/s3-init-6gm2r" Apr 20 20:14:37.743692 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:37.743614 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-6gm2r" Apr 20 20:14:37.861870 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:37.861839 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-6gm2r"] Apr 20 20:14:37.864553 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:14:37.864526 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f256a79_a998_41f8_be60_f43eed0bbc4d.slice/crio-299ddfbba4b933ec3047cf6fac755ff941bfae5dad2f5b76dad48be715d6c97f WatchSource:0}: Error finding container 299ddfbba4b933ec3047cf6fac755ff941bfae5dad2f5b76dad48be715d6c97f: Status 404 returned error can't find the container with id 299ddfbba4b933ec3047cf6fac755ff941bfae5dad2f5b76dad48be715d6c97f Apr 20 20:14:38.602364 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:38.602322 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6gm2r" event={"ID":"5f256a79-a998-41f8-be60-f43eed0bbc4d","Type":"ContainerStarted","Data":"299ddfbba4b933ec3047cf6fac755ff941bfae5dad2f5b76dad48be715d6c97f"} Apr 20 20:14:42.617684 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:42.617605 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6gm2r" event={"ID":"5f256a79-a998-41f8-be60-f43eed0bbc4d","Type":"ContainerStarted","Data":"de1a5b2fc2489c00ad3c572a61d34ef9c22f8dbb6f8d81e6cca2ee82de123087"} Apr 20 20:14:42.635800 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:42.635752 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-6gm2r" podStartSLOduration=1.193111573 podStartE2EDuration="5.63573853s" podCreationTimestamp="2026-04-20 20:14:37 +0000 UTC" firstStartedPulling="2026-04-20 20:14:37.866309243 +0000 UTC m=+552.578991218" lastFinishedPulling="2026-04-20 20:14:42.308936198 +0000 UTC m=+557.021618175" observedRunningTime="2026-04-20 20:14:42.634425479 +0000 UTC m=+557.347107474" watchObservedRunningTime="2026-04-20 20:14:42.63573853 +0000 UTC m=+557.348420527" Apr 20 20:14:45.627910 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:45.627874 2576 generic.go:358] "Generic (PLEG): container finished" podID="5f256a79-a998-41f8-be60-f43eed0bbc4d" containerID="de1a5b2fc2489c00ad3c572a61d34ef9c22f8dbb6f8d81e6cca2ee82de123087" exitCode=0 Apr 20 20:14:45.628299 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:45.627947 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6gm2r" event={"ID":"5f256a79-a998-41f8-be60-f43eed0bbc4d","Type":"ContainerDied","Data":"de1a5b2fc2489c00ad3c572a61d34ef9c22f8dbb6f8d81e6cca2ee82de123087"} Apr 20 20:14:46.756309 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:46.756286 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-6gm2r" Apr 20 20:14:46.843234 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:46.843194 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2rbl\" (UniqueName: \"kubernetes.io/projected/5f256a79-a998-41f8-be60-f43eed0bbc4d-kube-api-access-g2rbl\") pod \"5f256a79-a998-41f8-be60-f43eed0bbc4d\" (UID: \"5f256a79-a998-41f8-be60-f43eed0bbc4d\") " Apr 20 20:14:46.845372 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:46.845344 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f256a79-a998-41f8-be60-f43eed0bbc4d-kube-api-access-g2rbl" (OuterVolumeSpecName: "kube-api-access-g2rbl") pod "5f256a79-a998-41f8-be60-f43eed0bbc4d" (UID: "5f256a79-a998-41f8-be60-f43eed0bbc4d"). InnerVolumeSpecName "kube-api-access-g2rbl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:14:46.944747 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:46.944686 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g2rbl\" (UniqueName: \"kubernetes.io/projected/5f256a79-a998-41f8-be60-f43eed0bbc4d-kube-api-access-g2rbl\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:14:47.635862 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:47.635832 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-6gm2r" Apr 20 20:14:47.635862 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:47.635849 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6gm2r" event={"ID":"5f256a79-a998-41f8-be60-f43eed0bbc4d","Type":"ContainerDied","Data":"299ddfbba4b933ec3047cf6fac755ff941bfae5dad2f5b76dad48be715d6c97f"} Apr 20 20:14:47.636051 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:47.635879 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="299ddfbba4b933ec3047cf6fac755ff941bfae5dad2f5b76dad48be715d6c97f" Apr 20 20:14:48.347982 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.347949 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"] Apr 20 20:14:48.348397 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.348298 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5f256a79-a998-41f8-be60-f43eed0bbc4d" containerName="s3-init" Apr 20 20:14:48.348397 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.348312 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f256a79-a998-41f8-be60-f43eed0bbc4d" containerName="s3-init" Apr 20 20:14:48.348397 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.348371 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5f256a79-a998-41f8-be60-f43eed0bbc4d" containerName="s3-init" Apr 20 20:14:48.351382 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.351367 2576 util.go:30] "No sandbox for pod can be found. 
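
The s3-init-6gm2r sequence above is a run-to-completion pod: its container de1a5b2f... is observed started at 20:14:42.617 and finished with exitCode=0 at 20:14:45.627, after which the volumes are unmounted and the sandbox reaped. Pairing PLEG ContainerStarted/ContainerDied events by container ID gives a coarse runtime, quantized to the relist period (roughly a second). A sketch using the two timestamps from the log:

    package main

    import (
        "fmt"
        "time"
    )

    // Pairing the PLEG events for the s3-init container above. These are
    // the times the kubelet *observed* the transitions, not the exact
    // process start/exit times.
    func main() {
        const layout = "2006-01-02 15:04:05.000000"
        started, _ := time.Parse(layout, "2026-04-20 20:14:42.617684")
        died, _ := time.Parse(layout, "2026-04-20 20:14:45.627910")
        fmt.Printf("de1a5b2f... observed runtime: %v (exitCode=0)\n", died.Sub(started))
    }
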
Apr 20 20:14:48.351382 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.351367 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"
Apr 20 20:14:48.353880 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.353861 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 20 20:14:48.359306 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.359284 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"]
Apr 20 20:14:48.454935 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.454913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/89d9fcfa-767b-4c8f-a4a0-0406ebdcf945-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-plp7g\" (UID: \"89d9fcfa-767b-4c8f-a4a0-0406ebdcf945\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"
Apr 20 20:14:48.455045 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.454944 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz4w6\" (UniqueName: \"kubernetes.io/projected/89d9fcfa-767b-4c8f-a4a0-0406ebdcf945-kube-api-access-wz4w6\") pod \"seaweedfs-tls-custom-ddd4dbfd-plp7g\" (UID: \"89d9fcfa-767b-4c8f-a4a0-0406ebdcf945\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"
Apr 20 20:14:48.555751 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.555727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/89d9fcfa-767b-4c8f-a4a0-0406ebdcf945-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-plp7g\" (UID: \"89d9fcfa-767b-4c8f-a4a0-0406ebdcf945\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"
Apr 20 20:14:48.555830 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.555758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wz4w6\" (UniqueName: \"kubernetes.io/projected/89d9fcfa-767b-4c8f-a4a0-0406ebdcf945-kube-api-access-wz4w6\") pod \"seaweedfs-tls-custom-ddd4dbfd-plp7g\" (UID: \"89d9fcfa-767b-4c8f-a4a0-0406ebdcf945\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"
Apr 20 20:14:48.556044 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.556027 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/89d9fcfa-767b-4c8f-a4a0-0406ebdcf945-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-plp7g\" (UID: \"89d9fcfa-767b-4c8f-a4a0-0406ebdcf945\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"
Apr 20 20:14:48.566471 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.566442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz4w6\" (UniqueName: \"kubernetes.io/projected/89d9fcfa-767b-4c8f-a4a0-0406ebdcf945-kube-api-access-wz4w6\") pod \"seaweedfs-tls-custom-ddd4dbfd-plp7g\" (UID: \"89d9fcfa-767b-4c8f-a4a0-0406ebdcf945\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"
Apr 20 20:14:48.660903 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.660843 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"
Apr 20 20:14:48.782193 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:48.782056 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"]
Apr 20 20:14:48.784897 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:14:48.784867 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89d9fcfa_767b_4c8f_a4a0_0406ebdcf945.slice/crio-4bb7b5ebce532048188faffb36f3898175ea6ee50fa21ed935dd652b522ed5a9 WatchSource:0}: Error finding container 4bb7b5ebce532048188faffb36f3898175ea6ee50fa21ed935dd652b522ed5a9: Status 404 returned error can't find the container with id 4bb7b5ebce532048188faffb36f3898175ea6ee50fa21ed935dd652b522ed5a9
Apr 20 20:14:49.644003 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:49.643965 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g" event={"ID":"89d9fcfa-767b-4c8f-a4a0-0406ebdcf945","Type":"ContainerStarted","Data":"6eae44f7596f593c879e94f40ae61e0735ebd8e0db6573bf3bbd7237728f4790"}
Apr 20 20:14:49.644407 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:49.644011 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g" event={"ID":"89d9fcfa-767b-4c8f-a4a0-0406ebdcf945","Type":"ContainerStarted","Data":"4bb7b5ebce532048188faffb36f3898175ea6ee50fa21ed935dd652b522ed5a9"}
Apr 20 20:14:49.659767 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:49.659715 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g" podStartSLOduration=1.380109303 podStartE2EDuration="1.659701188s" podCreationTimestamp="2026-04-20 20:14:48 +0000 UTC" firstStartedPulling="2026-04-20 20:14:48.786589407 +0000 UTC m=+563.499271381" lastFinishedPulling="2026-04-20 20:14:49.066181292 +0000 UTC m=+563.778863266" observedRunningTime="2026-04-20 20:14:49.658965147 +0000 UTC m=+564.371647155" watchObservedRunningTime="2026-04-20 20:14:49.659701188 +0000 UTC m=+564.372383185"
Apr 20 20:14:50.749501 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:50.749469 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"]
Apr 20 20:14:51.651743 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:51.651686 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g" podUID="89d9fcfa-767b-4c8f-a4a0-0406ebdcf945" containerName="seaweedfs-tls-custom" containerID="cri-o://6eae44f7596f593c879e94f40ae61e0735ebd8e0db6573bf3bbd7237728f4790" gracePeriod=30
Apr 20 20:14:52.901074 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:52.901047 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"
Apr 20 20:14:52.993821 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:52.993741 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/89d9fcfa-767b-4c8f-a4a0-0406ebdcf945-data\") pod \"89d9fcfa-767b-4c8f-a4a0-0406ebdcf945\" (UID: \"89d9fcfa-767b-4c8f-a4a0-0406ebdcf945\") "
Apr 20 20:14:52.993821 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:52.993808 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz4w6\" (UniqueName: \"kubernetes.io/projected/89d9fcfa-767b-4c8f-a4a0-0406ebdcf945-kube-api-access-wz4w6\") pod \"89d9fcfa-767b-4c8f-a4a0-0406ebdcf945\" (UID: \"89d9fcfa-767b-4c8f-a4a0-0406ebdcf945\") "
Apr 20 20:14:52.994905 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:52.994878 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89d9fcfa-767b-4c8f-a4a0-0406ebdcf945-data" (OuterVolumeSpecName: "data") pod "89d9fcfa-767b-4c8f-a4a0-0406ebdcf945" (UID: "89d9fcfa-767b-4c8f-a4a0-0406ebdcf945"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:14:52.995767 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:52.995740 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89d9fcfa-767b-4c8f-a4a0-0406ebdcf945-kube-api-access-wz4w6" (OuterVolumeSpecName: "kube-api-access-wz4w6") pod "89d9fcfa-767b-4c8f-a4a0-0406ebdcf945" (UID: "89d9fcfa-767b-4c8f-a4a0-0406ebdcf945"). InnerVolumeSpecName "kube-api-access-wz4w6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:14:53.094482 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.094452 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wz4w6\" (UniqueName: \"kubernetes.io/projected/89d9fcfa-767b-4c8f-a4a0-0406ebdcf945-kube-api-access-wz4w6\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:14:53.094482 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.094478 2576 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/89d9fcfa-767b-4c8f-a4a0-0406ebdcf945-data\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:14:53.659747 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.659713 2576 generic.go:358] "Generic (PLEG): container finished" podID="89d9fcfa-767b-4c8f-a4a0-0406ebdcf945" containerID="6eae44f7596f593c879e94f40ae61e0735ebd8e0db6573bf3bbd7237728f4790" exitCode=0
Apr 20 20:14:53.659923 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.659771 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"
Apr 20 20:14:53.659923 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.659772 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g" event={"ID":"89d9fcfa-767b-4c8f-a4a0-0406ebdcf945","Type":"ContainerDied","Data":"6eae44f7596f593c879e94f40ae61e0735ebd8e0db6573bf3bbd7237728f4790"}
Apr 20 20:14:53.659923 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.659872 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g" event={"ID":"89d9fcfa-767b-4c8f-a4a0-0406ebdcf945","Type":"ContainerDied","Data":"4bb7b5ebce532048188faffb36f3898175ea6ee50fa21ed935dd652b522ed5a9"}
Apr 20 20:14:53.659923 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.659886 2576 scope.go:117] "RemoveContainer" containerID="6eae44f7596f593c879e94f40ae61e0735ebd8e0db6573bf3bbd7237728f4790"
Apr 20 20:14:53.668997 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.668981 2576 scope.go:117] "RemoveContainer" containerID="6eae44f7596f593c879e94f40ae61e0735ebd8e0db6573bf3bbd7237728f4790"
Apr 20 20:14:53.669329 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:14:53.669311 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6eae44f7596f593c879e94f40ae61e0735ebd8e0db6573bf3bbd7237728f4790\": container with ID starting with 6eae44f7596f593c879e94f40ae61e0735ebd8e0db6573bf3bbd7237728f4790 not found: ID does not exist" containerID="6eae44f7596f593c879e94f40ae61e0735ebd8e0db6573bf3bbd7237728f4790"
Apr 20 20:14:53.669392 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.669338 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eae44f7596f593c879e94f40ae61e0735ebd8e0db6573bf3bbd7237728f4790"} err="failed to get container status \"6eae44f7596f593c879e94f40ae61e0735ebd8e0db6573bf3bbd7237728f4790\": rpc error: code = NotFound desc = could not find container \"6eae44f7596f593c879e94f40ae61e0735ebd8e0db6573bf3bbd7237728f4790\": container with ID starting with 6eae44f7596f593c879e94f40ae61e0735ebd8e0db6573bf3bbd7237728f4790 not found: ID does not exist"
Apr 20 20:14:53.680829 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.680808 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"]
Apr 20 20:14:53.684613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.684594 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-plp7g"]
Apr 20 20:14:53.713545 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.713521 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9"]
Apr 20 20:14:53.713831 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.713820 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89d9fcfa-767b-4c8f-a4a0-0406ebdcf945" containerName="seaweedfs-tls-custom"
Apr 20 20:14:53.713873 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.713833 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="89d9fcfa-767b-4c8f-a4a0-0406ebdcf945" containerName="seaweedfs-tls-custom"
Apr 20 20:14:53.713909 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.713886 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="89d9fcfa-767b-4c8f-a4a0-0406ebdcf945" containerName="seaweedfs-tls-custom"
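
Note the pod names across this stretch: seaweedfs-tls-custom-ddd4dbfd-plp7g is deleted and, seconds later, seaweedfs-tls-custom-5c88b85bb7-zgmn9 is added, i.e. a new ReplicaSet of the same Deployment (the middle segment is the pod-template-hash). The RemoveStaleState and "Deleted CPUSet assignment" entries are the CPU and memory managers dropping per-container accounting for the old pod. A heuristic split of such names (this is naming convention only, not an API guarantee; ownerReferences are the authoritative link):

    package main

    import (
        "fmt"
        "regexp"
    )

    // Splits a Deployment-owned pod name into base name, ReplicaSet
    // pod-template-hash, and random pod suffix. The character classes are
    // loose on purpose: real hashes use a wider alphabet than hex.
    var podName = regexp.MustCompile(`^(.*)-([a-z0-9]{8,10})-([a-z0-9]{5})$`)

    func main() {
        for _, p := range []string{
            "seaweedfs-tls-custom-ddd4dbfd-plp7g",
            "seaweedfs-tls-custom-5c88b85bb7-zgmn9",
        } {
            if m := podName.FindStringSubmatch(p); m != nil {
                fmt.Printf("deployment=%s hash=%s pod=%s\n", m[1], m[2], m[3])
            }
        }
    }
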
Apr 20 20:14:53.721352 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.721337 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9"
Apr 20 20:14:53.724086 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.724065 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 20 20:14:53.724202 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.724094 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9"]
Apr 20 20:14:53.724260 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.724205 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\""
Apr 20 20:14:53.799937 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.799912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/14e6e679-f921-4d34-a883-989f705c6e43-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-zgmn9\" (UID: \"14e6e679-f921-4d34-a883-989f705c6e43\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9"
Apr 20 20:14:53.800026 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.799947 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/14e6e679-f921-4d34-a883-989f705c6e43-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-zgmn9\" (UID: \"14e6e679-f921-4d34-a883-989f705c6e43\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9"
Apr 20 20:14:53.800071 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.800049 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv54t\" (UniqueName: \"kubernetes.io/projected/14e6e679-f921-4d34-a883-989f705c6e43-kube-api-access-jv54t\") pod \"seaweedfs-tls-custom-5c88b85bb7-zgmn9\" (UID: \"14e6e679-f921-4d34-a883-989f705c6e43\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9"
Apr 20 20:14:53.900735 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.900710 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jv54t\" (UniqueName: \"kubernetes.io/projected/14e6e679-f921-4d34-a883-989f705c6e43-kube-api-access-jv54t\") pod \"seaweedfs-tls-custom-5c88b85bb7-zgmn9\" (UID: \"14e6e679-f921-4d34-a883-989f705c6e43\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9"
Apr 20 20:14:53.900830 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.900748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/14e6e679-f921-4d34-a883-989f705c6e43-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-zgmn9\" (UID: \"14e6e679-f921-4d34-a883-989f705c6e43\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9"
Apr 20 20:14:53.900830 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.900773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/14e6e679-f921-4d34-a883-989f705c6e43-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-zgmn9\" (UID: \"14e6e679-f921-4d34-a883-989f705c6e43\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9"
Apr 20 20:14:53.901077 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.901059 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/14e6e679-f921-4d34-a883-989f705c6e43-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-zgmn9\" (UID: \"14e6e679-f921-4d34-a883-989f705c6e43\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9"
Apr 20 20:14:53.903065 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.903048 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/14e6e679-f921-4d34-a883-989f705c6e43-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-zgmn9\" (UID: \"14e6e679-f921-4d34-a883-989f705c6e43\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9"
Apr 20 20:14:53.909053 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.909032 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv54t\" (UniqueName: \"kubernetes.io/projected/14e6e679-f921-4d34-a883-989f705c6e43-kube-api-access-jv54t\") pod \"seaweedfs-tls-custom-5c88b85bb7-zgmn9\" (UID: \"14e6e679-f921-4d34-a883-989f705c6e43\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9"
Apr 20 20:14:53.944775 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:53.944725 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89d9fcfa-767b-4c8f-a4a0-0406ebdcf945" path="/var/lib/kubelet/pods/89d9fcfa-767b-4c8f-a4a0-0406ebdcf945/volumes"
Apr 20 20:14:54.032304 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:54.032283 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9"
Apr 20 20:14:54.153294 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:54.153268 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9"]
Apr 20 20:14:54.155462 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:14:54.155433 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14e6e679_f921_4d34_a883_989f705c6e43.slice/crio-f482a2f329ef45bab6f15bd41cebfd97726e1eb0c259f309df4e7ca59d9dc395 WatchSource:0}: Error finding container f482a2f329ef45bab6f15bd41cebfd97726e1eb0c259f309df4e7ca59d9dc395: Status 404 returned error can't find the container with id f482a2f329ef45bab6f15bd41cebfd97726e1eb0c259f309df4e7ca59d9dc395
Apr 20 20:14:54.664086 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:54.664056 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9" event={"ID":"14e6e679-f921-4d34-a883-989f705c6e43","Type":"ContainerStarted","Data":"f482a2f329ef45bab6f15bd41cebfd97726e1eb0c259f309df4e7ca59d9dc395"}
Apr 20 20:14:55.669681 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:55.669640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9" event={"ID":"14e6e679-f921-4d34-a883-989f705c6e43","Type":"ContainerStarted","Data":"95672c94ac009d967683aa6032ae6d2cd729c4ac786ee2e2d8b37d3d35385b9f"}
Apr 20 20:14:55.687073 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:55.687026 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-zgmn9" podStartSLOduration=2.245969723 podStartE2EDuration="2.687013415s" podCreationTimestamp="2026-04-20 20:14:53 +0000 UTC" firstStartedPulling="2026-04-20 20:14:54.156683295 +0000 UTC m=+568.869365269" lastFinishedPulling="2026-04-20 20:14:54.597726982 +0000 UTC m=+569.310408961" observedRunningTime="2026-04-20 20:14:55.685883261 +0000 UTC m=+570.398565258" watchObservedRunningTime="2026-04-20 20:14:55.687013415 +0000 UTC m=+570.399695412"
Apr 20 20:14:55.972055 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:55.971975 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-njtxf"]
Apr 20 20:14:55.976719 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:55.976698 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-njtxf"
Apr 20 20:14:55.982001 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:55.981979 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-njtxf"]
Apr 20 20:14:56.017977 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:56.017952 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpfk9\" (UniqueName: \"kubernetes.io/projected/a33aa541-18c2-4900-943e-7be3c40c5e1b-kube-api-access-lpfk9\") pod \"s3-tls-init-custom-njtxf\" (UID: \"a33aa541-18c2-4900-943e-7be3c40c5e1b\") " pod="kserve/s3-tls-init-custom-njtxf"
Apr 20 20:14:56.118997 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:56.118968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpfk9\" (UniqueName: \"kubernetes.io/projected/a33aa541-18c2-4900-943e-7be3c40c5e1b-kube-api-access-lpfk9\") pod \"s3-tls-init-custom-njtxf\" (UID: \"a33aa541-18c2-4900-943e-7be3c40c5e1b\") " pod="kserve/s3-tls-init-custom-njtxf"
Apr 20 20:14:56.126854 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:56.126822 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpfk9\" (UniqueName: \"kubernetes.io/projected/a33aa541-18c2-4900-943e-7be3c40c5e1b-kube-api-access-lpfk9\") pod \"s3-tls-init-custom-njtxf\" (UID: \"a33aa541-18c2-4900-943e-7be3c40c5e1b\") " pod="kserve/s3-tls-init-custom-njtxf"
Apr 20 20:14:56.299866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:56.299840 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-njtxf"
Apr 20 20:14:56.416556 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:56.416528 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-njtxf"]
Apr 20 20:14:56.418851 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:14:56.418813 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda33aa541_18c2_4900_943e_7be3c40c5e1b.slice/crio-6b0bce072f515f62ed5bc52584fe05f5fe4b173c8ec2cd2448b59b718e0f0354 WatchSource:0}: Error finding container 6b0bce072f515f62ed5bc52584fe05f5fe4b173c8ec2cd2448b59b718e0f0354: Status 404 returned error can't find the container with id 6b0bce072f515f62ed5bc52584fe05f5fe4b173c8ec2cd2448b59b718e0f0354
Apr 20 20:14:56.674383 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:56.674291 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-njtxf" event={"ID":"a33aa541-18c2-4900-943e-7be3c40c5e1b","Type":"ContainerStarted","Data":"1badcba3869b93ad0185cabfe05e119f8845555b2262a5ccefa1abe8b19644e7"}
Apr 20 20:14:56.674383 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:56.674337 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-njtxf" event={"ID":"a33aa541-18c2-4900-943e-7be3c40c5e1b","Type":"ContainerStarted","Data":"6b0bce072f515f62ed5bc52584fe05f5fe4b173c8ec2cd2448b59b718e0f0354"}
Apr 20 20:14:56.691272 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:14:56.691223 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-njtxf" podStartSLOduration=1.6912081319999999 podStartE2EDuration="1.691208132s" podCreationTimestamp="2026-04-20 20:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:14:56.690412214 +0000 UTC m=+571.403094212" watchObservedRunningTime="2026-04-20 20:14:56.691208132 +0000 UTC m=+571.403890131"
Apr 20 20:15:01.691482 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:01.691449 2576 generic.go:358] "Generic (PLEG): container finished" podID="a33aa541-18c2-4900-943e-7be3c40c5e1b" containerID="1badcba3869b93ad0185cabfe05e119f8845555b2262a5ccefa1abe8b19644e7" exitCode=0
Apr 20 20:15:01.691896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:01.691509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-njtxf" event={"ID":"a33aa541-18c2-4900-943e-7be3c40c5e1b","Type":"ContainerDied","Data":"1badcba3869b93ad0185cabfe05e119f8845555b2262a5ccefa1abe8b19644e7"}
Apr 20 20:15:02.820273 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:02.820251 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-njtxf"
Apr 20 20:15:02.872950 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:02.872924 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpfk9\" (UniqueName: \"kubernetes.io/projected/a33aa541-18c2-4900-943e-7be3c40c5e1b-kube-api-access-lpfk9\") pod \"a33aa541-18c2-4900-943e-7be3c40c5e1b\" (UID: \"a33aa541-18c2-4900-943e-7be3c40c5e1b\") "
Apr 20 20:15:02.874968 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:02.874942 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a33aa541-18c2-4900-943e-7be3c40c5e1b-kube-api-access-lpfk9" (OuterVolumeSpecName: "kube-api-access-lpfk9") pod "a33aa541-18c2-4900-943e-7be3c40c5e1b" (UID: "a33aa541-18c2-4900-943e-7be3c40c5e1b"). InnerVolumeSpecName "kube-api-access-lpfk9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:15:02.973938 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:02.973871 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lpfk9\" (UniqueName: \"kubernetes.io/projected/a33aa541-18c2-4900-943e-7be3c40c5e1b-kube-api-access-lpfk9\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:15:03.698960 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:03.698931 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-njtxf"
Apr 20 20:15:03.699144 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:03.698925 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-njtxf" event={"ID":"a33aa541-18c2-4900-943e-7be3c40c5e1b","Type":"ContainerDied","Data":"6b0bce072f515f62ed5bc52584fe05f5fe4b173c8ec2cd2448b59b718e0f0354"}
Apr 20 20:15:03.699144 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:03.699034 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b0bce072f515f62ed5bc52584fe05f5fe4b173c8ec2cd2448b59b718e0f0354"
Apr 20 20:15:04.270101 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.270066 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl"]
Apr 20 20:15:04.270826 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.270807 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a33aa541-18c2-4900-943e-7be3c40c5e1b" containerName="s3-tls-init-custom"
Apr 20 20:15:04.270923 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.270828 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33aa541-18c2-4900-943e-7be3c40c5e1b" containerName="s3-tls-init-custom"
Apr 20 20:15:04.270975 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.270931 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a33aa541-18c2-4900-943e-7be3c40c5e1b" containerName="s3-tls-init-custom"
Apr 20 20:15:04.274150 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.274129 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl"
Apr 20 20:15:04.275519 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.275493 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl"]
Apr 20 20:15:04.276678 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.276651 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\""
Apr 20 20:15:04.276796 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.276651 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\""
Apr 20 20:15:04.384082 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.384057 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9cwg\" (UniqueName: \"kubernetes.io/projected/80102916-20d8-463f-9795-f3e3974b86f4-kube-api-access-p9cwg\") pod \"seaweedfs-tls-serving-7fd5766db9-p4qzl\" (UID: \"80102916-20d8-463f-9795-f3e3974b86f4\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl"
Apr 20 20:15:04.384212 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.384099 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/80102916-20d8-463f-9795-f3e3974b86f4-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-p4qzl\" (UID: \"80102916-20d8-463f-9795-f3e3974b86f4\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl"
Apr 20 20:15:04.384212 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.384141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/80102916-20d8-463f-9795-f3e3974b86f4-data\") pod \"seaweedfs-tls-serving-7fd5766db9-p4qzl\" (UID: \"80102916-20d8-463f-9795-f3e3974b86f4\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl"
Apr 20 20:15:04.485190 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.485157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/80102916-20d8-463f-9795-f3e3974b86f4-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-p4qzl\" (UID: \"80102916-20d8-463f-9795-f3e3974b86f4\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl"
Apr 20 20:15:04.485336 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.485195 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/80102916-20d8-463f-9795-f3e3974b86f4-data\") pod \"seaweedfs-tls-serving-7fd5766db9-p4qzl\" (UID: \"80102916-20d8-463f-9795-f3e3974b86f4\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl"
Apr 20 20:15:04.485336 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.485252 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9cwg\" (UniqueName: \"kubernetes.io/projected/80102916-20d8-463f-9795-f3e3974b86f4-kube-api-access-p9cwg\") pod \"seaweedfs-tls-serving-7fd5766db9-p4qzl\" (UID: \"80102916-20d8-463f-9795-f3e3974b86f4\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl"
Apr 20 20:15:04.485617 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.485597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/80102916-20d8-463f-9795-f3e3974b86f4-data\") pod \"seaweedfs-tls-serving-7fd5766db9-p4qzl\" (UID: \"80102916-20d8-463f-9795-f3e3974b86f4\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl"
Apr 20 20:15:04.487501 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.487484 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/80102916-20d8-463f-9795-f3e3974b86f4-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-p4qzl\" (UID: \"80102916-20d8-463f-9795-f3e3974b86f4\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl"
Apr 20 20:15:04.493356 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.493327 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9cwg\" (UniqueName: \"kubernetes.io/projected/80102916-20d8-463f-9795-f3e3974b86f4-kube-api-access-p9cwg\") pod \"seaweedfs-tls-serving-7fd5766db9-p4qzl\" (UID: \"80102916-20d8-463f-9795-f3e3974b86f4\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl"
Apr 20 20:15:04.584992 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.584958 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl"
Apr 20 20:15:04.717055 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:04.717031 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl"]
Apr 20 20:15:04.719103 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:15:04.719071 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80102916_20d8_463f_9795_f3e3974b86f4.slice/crio-2294f8a76998d11c97a13a5afe61e495c6ee32506ec001bc055e5d2b4c765cae WatchSource:0}: Error finding container 2294f8a76998d11c97a13a5afe61e495c6ee32506ec001bc055e5d2b4c765cae: Status 404 returned error can't find the container with id 2294f8a76998d11c97a13a5afe61e495c6ee32506ec001bc055e5d2b4c765cae
Apr 20 20:15:05.707954 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:05.707923 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl" event={"ID":"80102916-20d8-463f-9795-f3e3974b86f4","Type":"ContainerStarted","Data":"9768d89b6978df6bf7c3e3a4911a76bba03a03239feb7d99d7eb2c0144edf13e"}
Apr 20 20:15:05.707954 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:05.707958 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl" event={"ID":"80102916-20d8-463f-9795-f3e3974b86f4","Type":"ContainerStarted","Data":"2294f8a76998d11c97a13a5afe61e495c6ee32506ec001bc055e5d2b4c765cae"}
Apr 20 20:15:05.724516 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:05.724464 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-p4qzl" podStartSLOduration=1.423586901 podStartE2EDuration="1.724448903s" podCreationTimestamp="2026-04-20 20:15:04 +0000 UTC" firstStartedPulling="2026-04-20 20:15:04.720204164 +0000 UTC m=+579.432886140" lastFinishedPulling="2026-04-20 20:15:05.021066165 +0000 UTC m=+579.733748142" observedRunningTime="2026-04-20 20:15:05.723306791 +0000 UTC m=+580.435988790" watchObservedRunningTime="2026-04-20 20:15:05.724448903 +0000 UTC m=+580.437130937"
Apr 20 20:15:25.819901 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:25.819877 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log"
Apr 20 20:15:25.820427 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:25.819878 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log"
Apr 20 20:15:28.581121 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.581071 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"]
Apr 20 20:15:28.586168 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.586150 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:15:28.589160 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.588919 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-predictor-serving-cert\""
Apr 20 20:15:28.589405 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.589387 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\""
Apr 20 20:15:28.589551 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.589528 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5zv6\""
Apr 20 20:15:28.589748 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.589727 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 20 20:15:28.589875 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.589407 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 20 20:15:28.596207 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.596183 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"]
Apr 20 20:15:28.775173 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.775131 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d08af062-20ce-42c9-af90-0b4308d2e366-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:15:28.775370 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.775202 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d08af062-20ce-42c9-af90-0b4308d2e366-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:15:28.775370 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.775295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d08af062-20ce-42c9-af90-0b4308d2e366-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:15:28.775370 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.775345 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d6wq\" (UniqueName: \"kubernetes.io/projected/d08af062-20ce-42c9-af90-0b4308d2e366-kube-api-access-4d6wq\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:15:28.876616 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.876532 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d08af062-20ce-42c9-af90-0b4308d2e366-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:15:28.876616 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.876576 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4d6wq\" (UniqueName: \"kubernetes.io/projected/d08af062-20ce-42c9-af90-0b4308d2e366-kube-api-access-4d6wq\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:15:28.876830 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.876673 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d08af062-20ce-42c9-af90-0b4308d2e366-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:15:28.876830 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.876708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d08af062-20ce-42c9-af90-0b4308d2e366-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:15:28.876830 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:15:28.876816 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-serving-cert: secret "isvc-sklearn-batcher-predictor-serving-cert" not found
Apr 20 20:15:28.877000 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:15:28.876874 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d08af062-20ce-42c9-af90-0b4308d2e366-proxy-tls podName:d08af062-20ce-42c9-af90-0b4308d2e366 nodeName:}" failed. No retries permitted until 2026-04-20 20:15:29.376856381 +0000 UTC m=+604.089538357 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/d08af062-20ce-42c9-af90-0b4308d2e366-proxy-tls") pod "isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" (UID: "d08af062-20ce-42c9-af90-0b4308d2e366") : secret "isvc-sklearn-batcher-predictor-serving-cert" not found
Apr 20 20:15:28.877000 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.876980 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d08af062-20ce-42c9-af90-0b4308d2e366-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:15:28.877466 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.877443 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d08af062-20ce-42c9-af90-0b4308d2e366-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:15:28.888953 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:28.888934 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d6wq\" (UniqueName: \"kubernetes.io/projected/d08af062-20ce-42c9-af90-0b4308d2e366-kube-api-access-4d6wq\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:15:29.380386 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:29.380340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d08af062-20ce-42c9-af90-0b4308d2e366-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:15:29.382843 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:29.382816 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d08af062-20ce-42c9-af90-0b4308d2e366-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" Apr 20 20:15:29.834336 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:29.834252 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"] Apr 20 20:15:29.836482 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:15:29.836451 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd08af062_20ce_42c9_af90_0b4308d2e366.slice/crio-098c53e49b4a058bb997081a650223efbc5e0b056ff031dc6d6e4d300f345e03 WatchSource:0}: Error finding container 098c53e49b4a058bb997081a650223efbc5e0b056ff031dc6d6e4d300f345e03: Status 404 returned error can't find the container with id 098c53e49b4a058bb997081a650223efbc5e0b056ff031dc6d6e4d300f345e03 Apr 20 20:15:30.790171 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:30.790123 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" event={"ID":"d08af062-20ce-42c9-af90-0b4308d2e366","Type":"ContainerStarted","Data":"098c53e49b4a058bb997081a650223efbc5e0b056ff031dc6d6e4d300f345e03"} Apr 20 20:15:33.801829 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:33.801794 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" event={"ID":"d08af062-20ce-42c9-af90-0b4308d2e366","Type":"ContainerStarted","Data":"2e222b591474332b027c9b30a13aa9a6eb07a15a29770764a31f627f26dcb344"} Apr 20 20:15:37.816271 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:37.816236 2576 generic.go:358] "Generic (PLEG): container finished" podID="d08af062-20ce-42c9-af90-0b4308d2e366" containerID="2e222b591474332b027c9b30a13aa9a6eb07a15a29770764a31f627f26dcb344" exitCode=0 Apr 20 20:15:37.816694 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:37.816308 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" event={"ID":"d08af062-20ce-42c9-af90-0b4308d2e366","Type":"ContainerDied","Data":"2e222b591474332b027c9b30a13aa9a6eb07a15a29770764a31f627f26dcb344"} Apr 20 20:15:51.875537 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:51.875494 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" event={"ID":"d08af062-20ce-42c9-af90-0b4308d2e366","Type":"ContainerStarted","Data":"e84570c41cde812133b7074634d7883ef65befff7d7bb492ec795091a537dea7"} Apr 20 20:15:53.884557 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:53.884521 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" event={"ID":"d08af062-20ce-42c9-af90-0b4308d2e366","Type":"ContainerStarted","Data":"73b33b30ff51379af32a92b84aa3d7f0f27da7b134f62e5b272b32b824e6fa2c"} Apr 20 20:15:56.903148 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:56.903057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" event={"ID":"d08af062-20ce-42c9-af90-0b4308d2e366","Type":"ContainerStarted","Data":"4eed1548ad7c7561353683247dbf440e04ca588b7632d584c8d0249fd4fb48e9"} Apr 20 20:15:56.903555 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:56.903330 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" Apr 20 
20:15:56.903555 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:56.903434 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" Apr 20 20:15:56.904623 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:56.904598 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 20 20:15:56.924536 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:56.924490 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podStartSLOduration=2.185810228 podStartE2EDuration="28.924480005s" podCreationTimestamp="2026-04-20 20:15:28 +0000 UTC" firstStartedPulling="2026-04-20 20:15:29.838494944 +0000 UTC m=+604.551176920" lastFinishedPulling="2026-04-20 20:15:56.577164723 +0000 UTC m=+631.289846697" observedRunningTime="2026-04-20 20:15:56.92268242 +0000 UTC m=+631.635364432" watchObservedRunningTime="2026-04-20 20:15:56.924480005 +0000 UTC m=+631.637162001" Apr 20 20:15:57.907058 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:57.907026 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" Apr 20 20:15:57.907533 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:57.907202 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 20 20:15:57.908039 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:57.908015 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:15:58.910642 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:58.910596 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 20 20:15:58.911075 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:58.911016 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:15:58.914683 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:58.914663 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" Apr 20 20:15:59.913377 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:59.913346 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" 
Apr 20 20:15:59.913772 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:15:59.913632 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:16:09.913319 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:16:09.913279 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 20 20:16:09.913815 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:16:09.913767 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:16:19.914210 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:16:19.914168 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 20 20:16:19.914677 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:16:19.914529 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:16:29.914196 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:16:29.914147 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 20 20:16:29.914682 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:16:29.914544 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:16:39.913681 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:16:39.913624 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 20 20:16:39.914097 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:16:39.914077 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:16:49.913907 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:16:49.913861 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 20 20:16:49.914418 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:16:49.914392 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:16:59.914326 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:16:59.914289 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:16:59.914820 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:16:59.914800 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:17:13.703265 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:13.703064 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"]
Apr 20 20:17:13.703746 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:13.703599 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container" containerID="cri-o://e84570c41cde812133b7074634d7883ef65befff7d7bb492ec795091a537dea7" gracePeriod=30
Apr 20 20:17:13.703835 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:13.703803 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="agent" containerID="cri-o://4eed1548ad7c7561353683247dbf440e04ca588b7632d584c8d0249fd4fb48e9" gracePeriod=30
Apr 20 20:17:13.703894 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:13.703819 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kube-rbac-proxy" containerID="cri-o://73b33b30ff51379af32a92b84aa3d7f0f27da7b134f62e5b272b32b824e6fa2c" gracePeriod=30
Apr 20 20:17:13.805545 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:13.805517 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"]
Apr 20 20:17:13.810379 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:13.810340 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:13.814483 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:13.814433 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-predictor-serving-cert\""
Apr 20 20:17:13.814605 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:13.814487 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\""
Apr 20 20:17:13.818573 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:13.818545 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"]
Apr 20 20:17:13.911309 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:13.911276 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused"
Apr 20 20:17:13.913715 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:13.913685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxklw\" (UniqueName: \"kubernetes.io/projected/975ec177-b69b-4b8b-9297-309ea4fec83a-kube-api-access-pxklw\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:13.913830 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:13.913738 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/975ec177-b69b-4b8b-9297-309ea4fec83a-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:13.913830 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:13.913783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/975ec177-b69b-4b8b-9297-309ea4fec83a-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:13.913830 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:13.913825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/975ec177-b69b-4b8b-9297-309ea4fec83a-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:14.015301 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:14.015232 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxklw\" (UniqueName: \"kubernetes.io/projected/975ec177-b69b-4b8b-9297-309ea4fec83a-kube-api-access-pxklw\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:14.015301 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:14.015264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/975ec177-b69b-4b8b-9297-309ea4fec83a-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:14.015301 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:14.015289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/975ec177-b69b-4b8b-9297-309ea4fec83a-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:14.015494 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:14.015344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/975ec177-b69b-4b8b-9297-309ea4fec83a-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:14.015764 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:14.015744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/975ec177-b69b-4b8b-9297-309ea4fec83a-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:14.015892 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:14.015861 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/975ec177-b69b-4b8b-9297-309ea4fec83a-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:14.017832 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:14.017811 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/975ec177-b69b-4b8b-9297-309ea4fec83a-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:14.023922 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:14.023903 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxklw\" (UniqueName: \"kubernetes.io/projected/975ec177-b69b-4b8b-9297-309ea4fec83a-kube-api-access-pxklw\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:14.124961 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:14.124934 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:14.153889 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:14.153861 2576 generic.go:358] "Generic (PLEG): container finished" podID="d08af062-20ce-42c9-af90-0b4308d2e366" containerID="73b33b30ff51379af32a92b84aa3d7f0f27da7b134f62e5b272b32b824e6fa2c" exitCode=2
Apr 20 20:17:14.154015 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:14.153914 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" event={"ID":"d08af062-20ce-42c9-af90-0b4308d2e366","Type":"ContainerDied","Data":"73b33b30ff51379af32a92b84aa3d7f0f27da7b134f62e5b272b32b824e6fa2c"}
Apr 20 20:17:14.249377 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:14.249352 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"]
Apr 20 20:17:14.251650 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:17:14.251623 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod975ec177_b69b_4b8b_9297_309ea4fec83a.slice/crio-4470ddd38ac4a68dda2112f586e100e3854d078aed1a140b2fc4275cc050ce7e WatchSource:0}: Error finding container 4470ddd38ac4a68dda2112f586e100e3854d078aed1a140b2fc4275cc050ce7e: Status 404 returned error can't find the container with id 4470ddd38ac4a68dda2112f586e100e3854d078aed1a140b2fc4275cc050ce7e
Apr 20 20:17:14.253466 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:14.253450 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:17:15.158220 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:15.158184 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" event={"ID":"975ec177-b69b-4b8b-9297-309ea4fec83a","Type":"ContainerStarted","Data":"8caf7f055159cf29c23c08f4b1e980877df07c0c5367c1fc31824ef2ffceabfc"}
Apr 20 20:17:15.158220 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:15.158220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" event={"ID":"975ec177-b69b-4b8b-9297-309ea4fec83a","Type":"ContainerStarted","Data":"4470ddd38ac4a68dda2112f586e100e3854d078aed1a140b2fc4275cc050ce7e"}
Apr 20 20:17:18.168982 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:18.168950 2576 generic.go:358] "Generic (PLEG): container finished" podID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerID="8caf7f055159cf29c23c08f4b1e980877df07c0c5367c1fc31824ef2ffceabfc" exitCode=0
Apr 20 20:17:18.169385 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:18.169031 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" event={"ID":"975ec177-b69b-4b8b-9297-309ea4fec83a","Type":"ContainerDied","Data":"8caf7f055159cf29c23c08f4b1e980877df07c0c5367c1fc31824ef2ffceabfc"}
Apr 20 20:17:18.171277 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:18.171255 2576 generic.go:358] "Generic (PLEG): container finished" podID="d08af062-20ce-42c9-af90-0b4308d2e366" containerID="e84570c41cde812133b7074634d7883ef65befff7d7bb492ec795091a537dea7" exitCode=0
Apr 20 20:17:18.171395 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:18.171331 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" event={"ID":"d08af062-20ce-42c9-af90-0b4308d2e366","Type":"ContainerDied","Data":"e84570c41cde812133b7074634d7883ef65befff7d7bb492ec795091a537dea7"}
Apr 20 20:17:18.911053 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:18.911005 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused"
Apr 20 20:17:19.176694 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:19.176605 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" event={"ID":"975ec177-b69b-4b8b-9297-309ea4fec83a","Type":"ContainerStarted","Data":"e25ed67c16919565d5ae741276062ee2cd76a8b0719f5c864f55e3e904d2b0a5"}
Apr 20 20:17:19.176694 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:19.176651 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" event={"ID":"975ec177-b69b-4b8b-9297-309ea4fec83a","Type":"ContainerStarted","Data":"0481bf64fd5b33fda03010fc300e08125d2381e54882b036262136458aa70988"}
Apr 20 20:17:19.176694 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:19.176661 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" event={"ID":"975ec177-b69b-4b8b-9297-309ea4fec83a","Type":"ContainerStarted","Data":"85e2f487a1bbfb330d5d9c3d57dedf6fd51e5057e344e24558e01ee490639688"}
Apr 20 20:17:19.177293 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:19.176953 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:19.177293 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:19.177085 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:19.178409 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:19.178383 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:5000: connect: connection refused"
Apr 20 20:17:19.198755 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:19.198714 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podStartSLOduration=6.198702762 podStartE2EDuration="6.198702762s" podCreationTimestamp="2026-04-20 20:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:17:19.196973175 +0000 UTC m=+713.909655199" watchObservedRunningTime="2026-04-20 20:17:19.198702762 +0000 UTC m=+713.911384759"
Apr 20 20:17:19.913673 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:19.913634 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 20 20:17:19.915258 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:19.915229 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:17:20.180701 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:20.180610 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:5000: connect: connection refused"
Apr 20 20:17:20.180701 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:20.180666 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:20.181666 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:20.181645 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:17:21.183355 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:21.183320 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:5000: connect: connection refused"
Apr 20 20:17:21.183802 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:21.183778 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:17:23.911584 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:23.911543 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused"
Apr 20 20:17:23.911962 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:23.911694 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:17:26.187837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:26.187811 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:17:26.188399 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:26.188358 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:5000: connect: connection refused"
Apr 20 20:17:26.188634 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:26.188611 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:17:28.910849 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:28.910808 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused"
Apr 20 20:17:29.913774 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:29.913731 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 20 20:17:29.915274 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:29.915250 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:17:33.911465 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:33.911424 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused"
Apr 20 20:17:36.189079 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:36.189032 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:5000: connect: connection refused"
Apr 20 20:17:36.189510 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:36.189487 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:17:38.911377 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:38.911336 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused"
Apr 20 20:17:39.914314 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:39.914275 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 20 20:17:39.914747 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:39.914426 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:17:39.915293 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:39.915262 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:17:39.915405 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:39.915372 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:17:43.844293 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:43.844268 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:17:43.858707 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:43.858683 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d08af062-20ce-42c9-af90-0b4308d2e366-kserve-provision-location\") pod \"d08af062-20ce-42c9-af90-0b4308d2e366\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") "
Apr 20 20:17:43.858809 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:43.858742 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d08af062-20ce-42c9-af90-0b4308d2e366-proxy-tls\") pod \"d08af062-20ce-42c9-af90-0b4308d2e366\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") "
Apr 20 20:17:43.858809 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:43.858792 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d6wq\" (UniqueName: \"kubernetes.io/projected/d08af062-20ce-42c9-af90-0b4308d2e366-kube-api-access-4d6wq\") pod \"d08af062-20ce-42c9-af90-0b4308d2e366\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") "
Apr 20 20:17:43.858896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:43.858841 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d08af062-20ce-42c9-af90-0b4308d2e366-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"d08af062-20ce-42c9-af90-0b4308d2e366\" (UID: \"d08af062-20ce-42c9-af90-0b4308d2e366\") "
Apr 20 20:17:43.859011 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:43.858986 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d08af062-20ce-42c9-af90-0b4308d2e366-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d08af062-20ce-42c9-af90-0b4308d2e366" (UID: "d08af062-20ce-42c9-af90-0b4308d2e366"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:17:43.859150 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:43.859070 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d08af062-20ce-42c9-af90-0b4308d2e366-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:17:43.859295 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:43.859271 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08af062-20ce-42c9-af90-0b4308d2e366-isvc-sklearn-batcher-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-kube-rbac-proxy-sar-config") pod "d08af062-20ce-42c9-af90-0b4308d2e366" (UID: "d08af062-20ce-42c9-af90-0b4308d2e366"). InnerVolumeSpecName "isvc-sklearn-batcher-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:17:43.860950 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:43.860914 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08af062-20ce-42c9-af90-0b4308d2e366-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d08af062-20ce-42c9-af90-0b4308d2e366" (UID: "d08af062-20ce-42c9-af90-0b4308d2e366"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:17:43.861038 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:43.860979 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08af062-20ce-42c9-af90-0b4308d2e366-kube-api-access-4d6wq" (OuterVolumeSpecName: "kube-api-access-4d6wq") pod "d08af062-20ce-42c9-af90-0b4308d2e366" (UID: "d08af062-20ce-42c9-af90-0b4308d2e366"). InnerVolumeSpecName "kube-api-access-4d6wq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:17:43.959960 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:43.959915 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d08af062-20ce-42c9-af90-0b4308d2e366-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:17:43.959960 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:43.959938 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4d6wq\" (UniqueName: \"kubernetes.io/projected/d08af062-20ce-42c9-af90-0b4308d2e366-kube-api-access-4d6wq\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:17:43.959960 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:43.959947 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d08af062-20ce-42c9-af90-0b4308d2e366-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:17:44.264525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.264441 2576 generic.go:358] "Generic (PLEG): container finished" podID="d08af062-20ce-42c9-af90-0b4308d2e366" containerID="4eed1548ad7c7561353683247dbf440e04ca588b7632d584c8d0249fd4fb48e9" exitCode=0
Apr 20 20:17:44.264651 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.264519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" event={"ID":"d08af062-20ce-42c9-af90-0b4308d2e366","Type":"ContainerDied","Data":"4eed1548ad7c7561353683247dbf440e04ca588b7632d584c8d0249fd4fb48e9"}
Apr 20 20:17:44.264651 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.264559 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d" event={"ID":"d08af062-20ce-42c9-af90-0b4308d2e366","Type":"ContainerDied","Data":"098c53e49b4a058bb997081a650223efbc5e0b056ff031dc6d6e4d300f345e03"}
Apr 20 20:17:44.264651 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.264565 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"
Apr 20 20:17:44.264651 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.264576 2576 scope.go:117] "RemoveContainer" containerID="4eed1548ad7c7561353683247dbf440e04ca588b7632d584c8d0249fd4fb48e9"
Apr 20 20:17:44.272238 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.272220 2576 scope.go:117] "RemoveContainer" containerID="73b33b30ff51379af32a92b84aa3d7f0f27da7b134f62e5b272b32b824e6fa2c"
Apr 20 20:17:44.279060 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.279044 2576 scope.go:117] "RemoveContainer" containerID="e84570c41cde812133b7074634d7883ef65befff7d7bb492ec795091a537dea7"
Apr 20 20:17:44.284197 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.284175 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"]
Apr 20 20:17:44.286560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.286541 2576 scope.go:117] "RemoveContainer" containerID="2e222b591474332b027c9b30a13aa9a6eb07a15a29770764a31f627f26dcb344"
Apr 20 20:17:44.288997 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.288980 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-47k5d"]
Apr 20 20:17:44.293407 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.293386 2576 scope.go:117] "RemoveContainer" containerID="4eed1548ad7c7561353683247dbf440e04ca588b7632d584c8d0249fd4fb48e9"
Apr 20 20:17:44.293641 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:17:44.293623 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eed1548ad7c7561353683247dbf440e04ca588b7632d584c8d0249fd4fb48e9\": container with ID starting with 4eed1548ad7c7561353683247dbf440e04ca588b7632d584c8d0249fd4fb48e9 not found: ID does not exist" containerID="4eed1548ad7c7561353683247dbf440e04ca588b7632d584c8d0249fd4fb48e9"
Apr 20 20:17:44.293683 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.293649 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eed1548ad7c7561353683247dbf440e04ca588b7632d584c8d0249fd4fb48e9"} err="failed to get container status \"4eed1548ad7c7561353683247dbf440e04ca588b7632d584c8d0249fd4fb48e9\": rpc error: code = NotFound desc = could not find container \"4eed1548ad7c7561353683247dbf440e04ca588b7632d584c8d0249fd4fb48e9\": container with ID starting with 4eed1548ad7c7561353683247dbf440e04ca588b7632d584c8d0249fd4fb48e9 not found: ID does not exist"
Apr 20 20:17:44.293683 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.293666 2576 scope.go:117] "RemoveContainer" containerID="73b33b30ff51379af32a92b84aa3d7f0f27da7b134f62e5b272b32b824e6fa2c"
Apr 20 20:17:44.293888 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:17:44.293875 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73b33b30ff51379af32a92b84aa3d7f0f27da7b134f62e5b272b32b824e6fa2c\": container with ID starting with 73b33b30ff51379af32a92b84aa3d7f0f27da7b134f62e5b272b32b824e6fa2c not found: ID does not exist" containerID="73b33b30ff51379af32a92b84aa3d7f0f27da7b134f62e5b272b32b824e6fa2c"
Apr 20 20:17:44.293926 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.293892 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73b33b30ff51379af32a92b84aa3d7f0f27da7b134f62e5b272b32b824e6fa2c"} err="failed to get container status \"73b33b30ff51379af32a92b84aa3d7f0f27da7b134f62e5b272b32b824e6fa2c\": rpc error: code = NotFound desc = could not find container \"73b33b30ff51379af32a92b84aa3d7f0f27da7b134f62e5b272b32b824e6fa2c\": container with ID starting with 73b33b30ff51379af32a92b84aa3d7f0f27da7b134f62e5b272b32b824e6fa2c not found: ID does not exist"
Apr 20 20:17:44.293926 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.293913 2576 scope.go:117] "RemoveContainer" containerID="e84570c41cde812133b7074634d7883ef65befff7d7bb492ec795091a537dea7"
Apr 20 20:17:44.294129 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:17:44.294099 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e84570c41cde812133b7074634d7883ef65befff7d7bb492ec795091a537dea7\": container with ID starting with e84570c41cde812133b7074634d7883ef65befff7d7bb492ec795091a537dea7 not found: ID does not exist" containerID="e84570c41cde812133b7074634d7883ef65befff7d7bb492ec795091a537dea7"
Apr 20 20:17:44.294169 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.294137 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e84570c41cde812133b7074634d7883ef65befff7d7bb492ec795091a537dea7"} err="failed to get container status \"e84570c41cde812133b7074634d7883ef65befff7d7bb492ec795091a537dea7\": rpc error: code = NotFound desc = could not find container \"e84570c41cde812133b7074634d7883ef65befff7d7bb492ec795091a537dea7\": container with ID starting with e84570c41cde812133b7074634d7883ef65befff7d7bb492ec795091a537dea7 not found: ID does not exist"
Apr 20 20:17:44.294169 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.294153 2576 scope.go:117] "RemoveContainer" containerID="2e222b591474332b027c9b30a13aa9a6eb07a15a29770764a31f627f26dcb344"
Apr 20 20:17:44.294358 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:17:44.294343 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e222b591474332b027c9b30a13aa9a6eb07a15a29770764a31f627f26dcb344\": container with ID starting with 2e222b591474332b027c9b30a13aa9a6eb07a15a29770764a31f627f26dcb344 not found: ID does not exist" containerID="2e222b591474332b027c9b30a13aa9a6eb07a15a29770764a31f627f26dcb344"
Apr 20 20:17:44.294407 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:44.294361 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e222b591474332b027c9b30a13aa9a6eb07a15a29770764a31f627f26dcb344"} err="failed to get container status \"2e222b591474332b027c9b30a13aa9a6eb07a15a29770764a31f627f26dcb344\": rpc error: code = NotFound desc = could not find container \"2e222b591474332b027c9b30a13aa9a6eb07a15a29770764a31f627f26dcb344\": container with ID starting with 2e222b591474332b027c9b30a13aa9a6eb07a15a29770764a31f627f26dcb344 not found: ID does not exist"
Apr 20 20:17:45.947044 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:45.946520 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" path="/var/lib/kubelet/pods/d08af062-20ce-42c9-af90-0b4308d2e366/volumes"
Apr 20 20:17:46.188655 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:46.188613 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:5000: connect: connection refused"
Apr 20 20:17:46.189104 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:46.189074 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:17:56.188568 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:56.188529 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:5000: connect: connection refused"
Apr 20 20:17:56.188950 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:17:56.188927 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:18:06.188386 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:06.188345 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:5000: connect: connection refused"
Apr 20 20:18:06.188813 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:06.188729 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:18:16.188847 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:16.188799 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:5000: connect: connection refused"
Apr 20 20:18:16.189283 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:16.189257 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:18:26.189088 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:26.188999 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:18:26.189490 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:26.189244 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"
Apr 20 20:18:38.867209 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.867121 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"]
Apr 20 20:18:38.867753 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.867480 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container" containerID="cri-o://85e2f487a1bbfb330d5d9c3d57dedf6fd51e5057e344e24558e01ee490639688" gracePeriod=30
Apr 20 20:18:38.867753 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.867541 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kube-rbac-proxy" containerID="cri-o://0481bf64fd5b33fda03010fc300e08125d2381e54882b036262136458aa70988" gracePeriod=30
Apr 20 20:18:38.867753 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.867500 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="agent" containerID="cri-o://e25ed67c16919565d5ae741276062ee2cd76a8b0719f5c864f55e3e904d2b0a5" gracePeriod=30
Apr 20 20:18:38.939558 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.939525 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"]
Apr 20 20:18:38.939912 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.939899 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="storage-initializer"
Apr 20 20:18:38.939954 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.939914 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="storage-initializer"
Apr 20 20:18:38.939954 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.939922 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container"
Apr 20 20:18:38.939954 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.939928 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container"
Apr 20 20:18:38.939954 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.939942 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="agent"
Apr 20 20:18:38.939954 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.939948 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="agent"
Apr 20 20:18:38.940101 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.939965 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kube-rbac-proxy"
Apr 20 20:18:38.940101 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.939973 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kube-rbac-proxy"
Apr 20 20:18:38.940101 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.940025 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kube-rbac-proxy"
Apr 20 20:18:38.940101 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.940039 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="kserve-container"
Apr 20 20:18:38.940101 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.940049 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d08af062-20ce-42c9-af90-0b4308d2e366" containerName="agent"
Apr 20 20:18:38.944397 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.944376 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"
Apr 20 20:18:38.947310 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.947285 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-kube-rbac-proxy-sar-config\""
Apr 20 20:18:38.947310 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.947306 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-predictor-serving-cert\""
Apr 20 20:18:38.953472 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:38.953449 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"]
Apr 20 20:18:39.089914 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:39.089883 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1962a842-eb66-4ef0-9c63-01c8d73366e5-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-twhw4\" (UID: \"1962a842-eb66-4ef0-9c63-01c8d73366e5\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"
Apr 20 20:18:39.090088 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:39.089981 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1962a842-eb66-4ef0-9c63-01c8d73366e5-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-twhw4\" (UID: \"1962a842-eb66-4ef0-9c63-01c8d73366e5\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"
Apr 20 20:18:39.090185 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:39.090095 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj4z9\" (UniqueName: \"kubernetes.io/projected/1962a842-eb66-4ef0-9c63-01c8d73366e5-kube-api-access-cj4z9\") pod \"message-dumper-predictor-c7d86bcbd-twhw4\" (UID: \"1962a842-eb66-4ef0-9c63-01c8d73366e5\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"
Apr 20 20:18:39.190812 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:39.190750 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1962a842-eb66-4ef0-9c63-01c8d73366e5-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-twhw4\" (UID: \"1962a842-eb66-4ef0-9c63-01c8d73366e5\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"
Apr 20 20:18:39.190952 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:39.190821 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cj4z9\" (UniqueName: \"kubernetes.io/projected/1962a842-eb66-4ef0-9c63-01c8d73366e5-kube-api-access-cj4z9\") pod \"message-dumper-predictor-c7d86bcbd-twhw4\" (UID: \"1962a842-eb66-4ef0-9c63-01c8d73366e5\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"
Apr 20 20:18:39.190952 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:39.190866 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1962a842-eb66-4ef0-9c63-01c8d73366e5-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-twhw4\" (UID: \"1962a842-eb66-4ef0-9c63-01c8d73366e5\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"
Apr 20 20:18:39.191507 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:39.191484 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1962a842-eb66-4ef0-9c63-01c8d73366e5-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-twhw4\" (UID: \"1962a842-eb66-4ef0-9c63-01c8d73366e5\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"
Apr 20 20:18:39.193140 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:39.193099 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1962a842-eb66-4ef0-9c63-01c8d73366e5-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-twhw4\" (UID: \"1962a842-eb66-4ef0-9c63-01c8d73366e5\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"
Apr 20 20:18:39.199265 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:39.199242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj4z9\" (UniqueName: \"kubernetes.io/projected/1962a842-eb66-4ef0-9c63-01c8d73366e5-kube-api-access-cj4z9\") pod \"message-dumper-predictor-c7d86bcbd-twhw4\" (UID: \"1962a842-eb66-4ef0-9c63-01c8d73366e5\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"
Apr 20 20:18:39.257047 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:39.257012 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"
Apr 20 20:18:39.377624 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:39.377080 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"]
Apr 20 20:18:39.383242 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:18:39.383190 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1962a842_eb66_4ef0_9c63_01c8d73366e5.slice/crio-8f8c8cc4867605b55c2b3c216aedbcfc4623a4299b888824b66b8b4260525374 WatchSource:0}: Error finding container 8f8c8cc4867605b55c2b3c216aedbcfc4623a4299b888824b66b8b4260525374: Status 404 returned error can't find the container with id 8f8c8cc4867605b55c2b3c216aedbcfc4623a4299b888824b66b8b4260525374
Apr 20 20:18:39.447538 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:39.447469 2576 generic.go:358] "Generic (PLEG): container finished" podID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerID="0481bf64fd5b33fda03010fc300e08125d2381e54882b036262136458aa70988" exitCode=2
Apr 20 20:18:39.447654 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:39.447547 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" event={"ID":"975ec177-b69b-4b8b-9297-309ea4fec83a","Type":"ContainerDied","Data":"0481bf64fd5b33fda03010fc300e08125d2381e54882b036262136458aa70988"}
Apr 20 20:18:39.448524 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:39.448504 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"
event={"ID":"1962a842-eb66-4ef0-9c63-01c8d73366e5","Type":"ContainerStarted","Data":"8f8c8cc4867605b55c2b3c216aedbcfc4623a4299b888824b66b8b4260525374"} Apr 20 20:18:41.183656 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:41.183613 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused" Apr 20 20:18:41.459449 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:41.459365 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4" event={"ID":"1962a842-eb66-4ef0-9c63-01c8d73366e5","Type":"ContainerStarted","Data":"dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95"} Apr 20 20:18:41.459449 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:41.459401 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4" event={"ID":"1962a842-eb66-4ef0-9c63-01c8d73366e5","Type":"ContainerStarted","Data":"7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85"} Apr 20 20:18:41.459651 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:41.459562 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4" Apr 20 20:18:41.477426 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:41.477374 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4" podStartSLOduration=2.286599286 podStartE2EDuration="3.477359s" podCreationTimestamp="2026-04-20 20:18:38 +0000 UTC" firstStartedPulling="2026-04-20 20:18:39.384675297 +0000 UTC m=+794.097357272" lastFinishedPulling="2026-04-20 20:18:40.575435009 +0000 UTC m=+795.288116986" observedRunningTime="2026-04-20 20:18:41.475840154 +0000 UTC m=+796.188522152" watchObservedRunningTime="2026-04-20 20:18:41.477359 +0000 UTC m=+796.190041001" Apr 20 20:18:42.463150 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:42.463097 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4" Apr 20 20:18:42.464769 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:42.464747 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4" Apr 20 20:18:43.468090 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:43.468057 2576 generic.go:358] "Generic (PLEG): container finished" podID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerID="85e2f487a1bbfb330d5d9c3d57dedf6fd51e5057e344e24558e01ee490639688" exitCode=0 Apr 20 20:18:43.468487 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:43.468145 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" event={"ID":"975ec177-b69b-4b8b-9297-309ea4fec83a","Type":"ContainerDied","Data":"85e2f487a1bbfb330d5d9c3d57dedf6fd51e5057e344e24558e01ee490639688"} Apr 20 20:18:46.183927 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:46.183881 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kube-rbac-proxy" 
probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused" Apr 20 20:18:46.189309 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:46.189280 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:5000: connect: connection refused" Apr 20 20:18:46.189614 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:46.189590 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:18:49.475958 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:49.475930 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4" Apr 20 20:18:51.183676 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:51.183638 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused" Apr 20 20:18:51.184141 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:51.183759 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" Apr 20 20:18:56.183905 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:56.183861 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused" Apr 20 20:18:56.189222 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:56.189194 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:5000: connect: connection refused" Apr 20 20:18:56.189561 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:56.189545 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:18:58.991127 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:58.991088 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs"] Apr 20 20:18:58.994869 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:58.994852 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:18:58.997628 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:58.997606 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-predictor-serving-cert\"" Apr 20 20:18:58.997725 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:58.997702 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-kube-rbac-proxy-sar-config\"" Apr 20 20:18:59.004680 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:59.004661 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs"] Apr 20 20:18:59.149825 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:59.149791 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99dadaa7-f559-4fe7-9267-042eeea2992f-proxy-tls\") pod \"isvc-logger-predictor-64d54fcc88-r98gs\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:18:59.149825 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:59.149827 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99dadaa7-f559-4fe7-9267-042eeea2992f-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-64d54fcc88-r98gs\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:18:59.150081 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:59.149852 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99dadaa7-f559-4fe7-9267-042eeea2992f-kserve-provision-location\") pod \"isvc-logger-predictor-64d54fcc88-r98gs\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:18:59.150081 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:59.149932 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t5k8\" (UniqueName: \"kubernetes.io/projected/99dadaa7-f559-4fe7-9267-042eeea2992f-kube-api-access-8t5k8\") pod \"isvc-logger-predictor-64d54fcc88-r98gs\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:18:59.250917 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:59.250826 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99dadaa7-f559-4fe7-9267-042eeea2992f-proxy-tls\") pod \"isvc-logger-predictor-64d54fcc88-r98gs\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:18:59.250917 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:59.250880 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99dadaa7-f559-4fe7-9267-042eeea2992f-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-64d54fcc88-r98gs\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 
20:18:59.251222 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:59.250918 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99dadaa7-f559-4fe7-9267-042eeea2992f-kserve-provision-location\") pod \"isvc-logger-predictor-64d54fcc88-r98gs\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:18:59.251222 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:59.250963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8t5k8\" (UniqueName: \"kubernetes.io/projected/99dadaa7-f559-4fe7-9267-042eeea2992f-kube-api-access-8t5k8\") pod \"isvc-logger-predictor-64d54fcc88-r98gs\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:18:59.251222 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:18:59.250988 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-logger-predictor-serving-cert: secret "isvc-logger-predictor-serving-cert" not found Apr 20 20:18:59.251222 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:18:59.251045 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99dadaa7-f559-4fe7-9267-042eeea2992f-proxy-tls podName:99dadaa7-f559-4fe7-9267-042eeea2992f nodeName:}" failed. No retries permitted until 2026-04-20 20:18:59.751029779 +0000 UTC m=+814.463711759 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/99dadaa7-f559-4fe7-9267-042eeea2992f-proxy-tls") pod "isvc-logger-predictor-64d54fcc88-r98gs" (UID: "99dadaa7-f559-4fe7-9267-042eeea2992f") : secret "isvc-logger-predictor-serving-cert" not found Apr 20 20:18:59.251422 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:59.251389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99dadaa7-f559-4fe7-9267-042eeea2992f-kserve-provision-location\") pod \"isvc-logger-predictor-64d54fcc88-r98gs\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:18:59.251643 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:59.251624 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99dadaa7-f559-4fe7-9267-042eeea2992f-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-64d54fcc88-r98gs\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:18:59.259637 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:59.259611 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t5k8\" (UniqueName: \"kubernetes.io/projected/99dadaa7-f559-4fe7-9267-042eeea2992f-kube-api-access-8t5k8\") pod \"isvc-logger-predictor-64d54fcc88-r98gs\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:18:59.755624 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:59.755577 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99dadaa7-f559-4fe7-9267-042eeea2992f-proxy-tls\") pod \"isvc-logger-predictor-64d54fcc88-r98gs\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") " 
pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:18:59.758080 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:59.758048 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99dadaa7-f559-4fe7-9267-042eeea2992f-proxy-tls\") pod \"isvc-logger-predictor-64d54fcc88-r98gs\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:18:59.907245 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:18:59.907214 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:19:00.032708 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:00.032666 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs"] Apr 20 20:19:00.034518 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:19:00.034482 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99dadaa7_f559_4fe7_9267_042eeea2992f.slice/crio-a288264925e52c26341e8adfc20e8348bb904242f620071bbb213d1af5e6154d WatchSource:0}: Error finding container a288264925e52c26341e8adfc20e8348bb904242f620071bbb213d1af5e6154d: Status 404 returned error can't find the container with id a288264925e52c26341e8adfc20e8348bb904242f620071bbb213d1af5e6154d Apr 20 20:19:00.531095 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:00.531048 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" event={"ID":"99dadaa7-f559-4fe7-9267-042eeea2992f","Type":"ContainerStarted","Data":"5d4365d0eabd6278c7c1eec14da24868f62a63fa62512b943f39450484924a7c"} Apr 20 20:19:00.531095 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:00.531093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" event={"ID":"99dadaa7-f559-4fe7-9267-042eeea2992f","Type":"ContainerStarted","Data":"a288264925e52c26341e8adfc20e8348bb904242f620071bbb213d1af5e6154d"} Apr 20 20:19:01.183887 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:01.183846 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused" Apr 20 20:19:04.545694 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:04.545662 2576 generic.go:358] "Generic (PLEG): container finished" podID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerID="5d4365d0eabd6278c7c1eec14da24868f62a63fa62512b943f39450484924a7c" exitCode=0 Apr 20 20:19:04.546087 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:04.545719 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" event={"ID":"99dadaa7-f559-4fe7-9267-042eeea2992f","Type":"ContainerDied","Data":"5d4365d0eabd6278c7c1eec14da24868f62a63fa62512b943f39450484924a7c"} Apr 20 20:19:05.551289 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:05.551251 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" 
event={"ID":"99dadaa7-f559-4fe7-9267-042eeea2992f","Type":"ContainerStarted","Data":"ccb5ca53f7095f789d16bd67877acff8382367674630344480f23c1714cd1a07"} Apr 20 20:19:05.551289 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:05.551293 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" event={"ID":"99dadaa7-f559-4fe7-9267-042eeea2992f","Type":"ContainerStarted","Data":"a39aa83ee2f64039af6b6c7a210ff73f4f890d2e6abada19330a977a13b5222c"} Apr 20 20:19:05.551690 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:05.551303 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" event={"ID":"99dadaa7-f559-4fe7-9267-042eeea2992f","Type":"ContainerStarted","Data":"d627ea5c10a89fe0750c832b9df94b1a17085566e0b1e964aa535a1ff5513ecf"} Apr 20 20:19:05.551690 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:05.551521 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:19:05.573090 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:05.573039 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podStartSLOduration=7.573025905 podStartE2EDuration="7.573025905s" podCreationTimestamp="2026-04-20 20:18:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:19:05.570537882 +0000 UTC m=+820.283219879" watchObservedRunningTime="2026-04-20 20:19:05.573025905 +0000 UTC m=+820.285707901" Apr 20 20:19:06.184005 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:06.183963 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.38:8643/healthz\": dial tcp 10.132.0.38:8643: connect: connection refused" Apr 20 20:19:06.188717 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:06.188692 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:5000: connect: connection refused" Apr 20 20:19:06.188828 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:06.188812 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" Apr 20 20:19:06.189144 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:06.189096 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:19:06.189234 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:06.189223 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" Apr 20 20:19:06.554963 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:06.554935 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 
20:19:06.555344 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:06.554974 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:19:06.556337 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:06.556303 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 20 20:19:06.556993 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:06.556970 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:19:07.558321 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:07.558282 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 20 20:19:07.558737 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:07.558649 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:19:09.020577 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.020552 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" Apr 20 20:19:09.030467 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.030446 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/975ec177-b69b-4b8b-9297-309ea4fec83a-kserve-provision-location\") pod \"975ec177-b69b-4b8b-9297-309ea4fec83a\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " Apr 20 20:19:09.030553 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.030489 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxklw\" (UniqueName: \"kubernetes.io/projected/975ec177-b69b-4b8b-9297-309ea4fec83a-kube-api-access-pxklw\") pod \"975ec177-b69b-4b8b-9297-309ea4fec83a\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " Apr 20 20:19:09.030553 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.030509 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/975ec177-b69b-4b8b-9297-309ea4fec83a-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"975ec177-b69b-4b8b-9297-309ea4fec83a\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " Apr 20 20:19:09.030553 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.030537 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/975ec177-b69b-4b8b-9297-309ea4fec83a-proxy-tls\") pod \"975ec177-b69b-4b8b-9297-309ea4fec83a\" (UID: \"975ec177-b69b-4b8b-9297-309ea4fec83a\") " Apr 20 20:19:09.030800 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.030778 2576 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/975ec177-b69b-4b8b-9297-309ea4fec83a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "975ec177-b69b-4b8b-9297-309ea4fec83a" (UID: "975ec177-b69b-4b8b-9297-309ea4fec83a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:19:09.030887 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.030833 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/975ec177-b69b-4b8b-9297-309ea4fec83a-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config") pod "975ec177-b69b-4b8b-9297-309ea4fec83a" (UID: "975ec177-b69b-4b8b-9297-309ea4fec83a"). InnerVolumeSpecName "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:19:09.032413 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.032394 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/975ec177-b69b-4b8b-9297-309ea4fec83a-kube-api-access-pxklw" (OuterVolumeSpecName: "kube-api-access-pxklw") pod "975ec177-b69b-4b8b-9297-309ea4fec83a" (UID: "975ec177-b69b-4b8b-9297-309ea4fec83a"). InnerVolumeSpecName "kube-api-access-pxklw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:19:09.032604 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.032584 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/975ec177-b69b-4b8b-9297-309ea4fec83a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "975ec177-b69b-4b8b-9297-309ea4fec83a" (UID: "975ec177-b69b-4b8b-9297-309ea4fec83a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:19:09.131586 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.131500 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/975ec177-b69b-4b8b-9297-309ea4fec83a-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:19:09.131586 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.131530 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pxklw\" (UniqueName: \"kubernetes.io/projected/975ec177-b69b-4b8b-9297-309ea4fec83a-kube-api-access-pxklw\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:19:09.131586 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.131541 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/975ec177-b69b-4b8b-9297-309ea4fec83a-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:19:09.131586 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.131551 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/975ec177-b69b-4b8b-9297-309ea4fec83a-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:19:09.567216 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.567182 2576 generic.go:358] "Generic (PLEG): container finished" podID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerID="e25ed67c16919565d5ae741276062ee2cd76a8b0719f5c864f55e3e904d2b0a5" exitCode=0 Apr 20 20:19:09.567370 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.567221 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" event={"ID":"975ec177-b69b-4b8b-9297-309ea4fec83a","Type":"ContainerDied","Data":"e25ed67c16919565d5ae741276062ee2cd76a8b0719f5c864f55e3e904d2b0a5"} Apr 20 20:19:09.567370 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.567249 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" event={"ID":"975ec177-b69b-4b8b-9297-309ea4fec83a","Type":"ContainerDied","Data":"4470ddd38ac4a68dda2112f586e100e3854d078aed1a140b2fc4275cc050ce7e"} Apr 20 20:19:09.567370 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.567264 2576 scope.go:117] "RemoveContainer" containerID="e25ed67c16919565d5ae741276062ee2cd76a8b0719f5c864f55e3e904d2b0a5" Apr 20 20:19:09.567370 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.567274 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v" Apr 20 20:19:09.575622 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.575605 2576 scope.go:117] "RemoveContainer" containerID="0481bf64fd5b33fda03010fc300e08125d2381e54882b036262136458aa70988" Apr 20 20:19:09.582719 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.582701 2576 scope.go:117] "RemoveContainer" containerID="85e2f487a1bbfb330d5d9c3d57dedf6fd51e5057e344e24558e01ee490639688" Apr 20 20:19:09.589714 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.589692 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"] Apr 20 20:19:09.590334 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.590322 2576 scope.go:117] "RemoveContainer" containerID="8caf7f055159cf29c23c08f4b1e980877df07c0c5367c1fc31824ef2ffceabfc" Apr 20 20:19:09.593940 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.593919 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-b8x7v"] Apr 20 20:19:09.597505 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.597481 2576 scope.go:117] "RemoveContainer" containerID="e25ed67c16919565d5ae741276062ee2cd76a8b0719f5c864f55e3e904d2b0a5" Apr 20 20:19:09.597759 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:19:09.597740 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e25ed67c16919565d5ae741276062ee2cd76a8b0719f5c864f55e3e904d2b0a5\": container with ID starting with e25ed67c16919565d5ae741276062ee2cd76a8b0719f5c864f55e3e904d2b0a5 not found: ID does not exist" containerID="e25ed67c16919565d5ae741276062ee2cd76a8b0719f5c864f55e3e904d2b0a5" Apr 20 20:19:09.597828 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.597767 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25ed67c16919565d5ae741276062ee2cd76a8b0719f5c864f55e3e904d2b0a5"} err="failed to get container status \"e25ed67c16919565d5ae741276062ee2cd76a8b0719f5c864f55e3e904d2b0a5\": rpc error: code = NotFound desc = could not find container \"e25ed67c16919565d5ae741276062ee2cd76a8b0719f5c864f55e3e904d2b0a5\": container with ID starting with e25ed67c16919565d5ae741276062ee2cd76a8b0719f5c864f55e3e904d2b0a5 not found: ID does not exist" Apr 20 20:19:09.597828 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.597787 2576 scope.go:117] "RemoveContainer" containerID="0481bf64fd5b33fda03010fc300e08125d2381e54882b036262136458aa70988" Apr 20 20:19:09.598020 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:19:09.597996 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0481bf64fd5b33fda03010fc300e08125d2381e54882b036262136458aa70988\": container with ID starting with 0481bf64fd5b33fda03010fc300e08125d2381e54882b036262136458aa70988 not found: ID does not exist" containerID="0481bf64fd5b33fda03010fc300e08125d2381e54882b036262136458aa70988" Apr 20 20:19:09.598060 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.598024 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0481bf64fd5b33fda03010fc300e08125d2381e54882b036262136458aa70988"} err="failed to get container status \"0481bf64fd5b33fda03010fc300e08125d2381e54882b036262136458aa70988\": rpc error: code = NotFound desc = could not find container 
\"0481bf64fd5b33fda03010fc300e08125d2381e54882b036262136458aa70988\": container with ID starting with 0481bf64fd5b33fda03010fc300e08125d2381e54882b036262136458aa70988 not found: ID does not exist" Apr 20 20:19:09.598060 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.598038 2576 scope.go:117] "RemoveContainer" containerID="85e2f487a1bbfb330d5d9c3d57dedf6fd51e5057e344e24558e01ee490639688" Apr 20 20:19:09.598282 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:19:09.598263 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85e2f487a1bbfb330d5d9c3d57dedf6fd51e5057e344e24558e01ee490639688\": container with ID starting with 85e2f487a1bbfb330d5d9c3d57dedf6fd51e5057e344e24558e01ee490639688 not found: ID does not exist" containerID="85e2f487a1bbfb330d5d9c3d57dedf6fd51e5057e344e24558e01ee490639688" Apr 20 20:19:09.598328 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.598287 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85e2f487a1bbfb330d5d9c3d57dedf6fd51e5057e344e24558e01ee490639688"} err="failed to get container status \"85e2f487a1bbfb330d5d9c3d57dedf6fd51e5057e344e24558e01ee490639688\": rpc error: code = NotFound desc = could not find container \"85e2f487a1bbfb330d5d9c3d57dedf6fd51e5057e344e24558e01ee490639688\": container with ID starting with 85e2f487a1bbfb330d5d9c3d57dedf6fd51e5057e344e24558e01ee490639688 not found: ID does not exist" Apr 20 20:19:09.598328 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.598302 2576 scope.go:117] "RemoveContainer" containerID="8caf7f055159cf29c23c08f4b1e980877df07c0c5367c1fc31824ef2ffceabfc" Apr 20 20:19:09.598505 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:19:09.598491 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8caf7f055159cf29c23c08f4b1e980877df07c0c5367c1fc31824ef2ffceabfc\": container with ID starting with 8caf7f055159cf29c23c08f4b1e980877df07c0c5367c1fc31824ef2ffceabfc not found: ID does not exist" containerID="8caf7f055159cf29c23c08f4b1e980877df07c0c5367c1fc31824ef2ffceabfc" Apr 20 20:19:09.598543 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.598511 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8caf7f055159cf29c23c08f4b1e980877df07c0c5367c1fc31824ef2ffceabfc"} err="failed to get container status \"8caf7f055159cf29c23c08f4b1e980877df07c0c5367c1fc31824ef2ffceabfc\": rpc error: code = NotFound desc = could not find container \"8caf7f055159cf29c23c08f4b1e980877df07c0c5367c1fc31824ef2ffceabfc\": container with ID starting with 8caf7f055159cf29c23c08f4b1e980877df07c0c5367c1fc31824ef2ffceabfc not found: ID does not exist" Apr 20 20:19:09.946392 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:09.946314 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" path="/var/lib/kubelet/pods/975ec177-b69b-4b8b-9297-309ea4fec83a/volumes" Apr 20 20:19:12.562260 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:12.562229 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:19:12.562885 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:12.562850 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 20 20:19:12.563337 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:12.563315 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:19:22.563427 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:22.563385 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 20 20:19:22.563922 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:22.563855 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:19:32.563181 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:32.563138 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 20 20:19:32.563695 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:32.563673 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:19:42.563625 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:42.563581 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 20 20:19:42.564138 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:42.564096 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:19:52.563593 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:52.563550 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 20 20:19:52.564079 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:19:52.564055 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:20:02.563600 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:02.563560 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 20 20:20:02.564100 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:02.564077 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 20:20:12.563288 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:12.563212 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:20:12.563727 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:12.563299 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" Apr 20 20:20:24.028084 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.028047 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-c7d86bcbd-twhw4_1962a842-eb66-4ef0-9c63-01c8d73366e5/kserve-container/0.log" Apr 20 20:20:24.302446 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.302419 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"] Apr 20 20:20:24.302751 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.302697 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4" podUID="1962a842-eb66-4ef0-9c63-01c8d73366e5" containerName="kserve-container" containerID="cri-o://7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85" gracePeriod=30 Apr 20 20:20:24.302830 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.302715 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4" podUID="1962a842-eb66-4ef0-9c63-01c8d73366e5" containerName="kube-rbac-proxy" containerID="cri-o://dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95" gracePeriod=30 Apr 20 20:20:24.551456 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.551434 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4" Apr 20 20:20:24.616776 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.616710 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj4z9\" (UniqueName: \"kubernetes.io/projected/1962a842-eb66-4ef0-9c63-01c8d73366e5-kube-api-access-cj4z9\") pod \"1962a842-eb66-4ef0-9c63-01c8d73366e5\" (UID: \"1962a842-eb66-4ef0-9c63-01c8d73366e5\") " Apr 20 20:20:24.616776 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.616748 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1962a842-eb66-4ef0-9c63-01c8d73366e5-proxy-tls\") pod \"1962a842-eb66-4ef0-9c63-01c8d73366e5\" (UID: \"1962a842-eb66-4ef0-9c63-01c8d73366e5\") " Apr 20 20:20:24.616958 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.616787 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1962a842-eb66-4ef0-9c63-01c8d73366e5-message-dumper-kube-rbac-proxy-sar-config\") pod \"1962a842-eb66-4ef0-9c63-01c8d73366e5\" (UID: \"1962a842-eb66-4ef0-9c63-01c8d73366e5\") " Apr 20 20:20:24.617272 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.617243 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1962a842-eb66-4ef0-9c63-01c8d73366e5-message-dumper-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-kube-rbac-proxy-sar-config") pod "1962a842-eb66-4ef0-9c63-01c8d73366e5" (UID: "1962a842-eb66-4ef0-9c63-01c8d73366e5"). InnerVolumeSpecName "message-dumper-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:20:24.618792 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.618774 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1962a842-eb66-4ef0-9c63-01c8d73366e5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1962a842-eb66-4ef0-9c63-01c8d73366e5" (UID: "1962a842-eb66-4ef0-9c63-01c8d73366e5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:20:24.618854 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.618834 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1962a842-eb66-4ef0-9c63-01c8d73366e5-kube-api-access-cj4z9" (OuterVolumeSpecName: "kube-api-access-cj4z9") pod "1962a842-eb66-4ef0-9c63-01c8d73366e5" (UID: "1962a842-eb66-4ef0-9c63-01c8d73366e5"). InnerVolumeSpecName "kube-api-access-cj4z9". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:20:24.717541 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.717497 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cj4z9\" (UniqueName: \"kubernetes.io/projected/1962a842-eb66-4ef0-9c63-01c8d73366e5-kube-api-access-cj4z9\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:20:24.717541 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.717528 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1962a842-eb66-4ef0-9c63-01c8d73366e5-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:20:24.717541 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.717545 2576 reconciler_common.go:299] "Volume detached for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1962a842-eb66-4ef0-9c63-01c8d73366e5-message-dumper-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:20:24.826952 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.826906 2576 generic.go:358] "Generic (PLEG): container finished" podID="1962a842-eb66-4ef0-9c63-01c8d73366e5" containerID="dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95" exitCode=2
Apr 20 20:20:24.826952 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.826949 2576 generic.go:358] "Generic (PLEG): container finished" podID="1962a842-eb66-4ef0-9c63-01c8d73366e5" containerID="7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85" exitCode=2
Apr 20 20:20:24.827152 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.826948 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4" event={"ID":"1962a842-eb66-4ef0-9c63-01c8d73366e5","Type":"ContainerDied","Data":"dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95"}
Apr 20 20:20:24.827152 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.826982 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"
Apr 20 20:20:24.827152 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.826995 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4" event={"ID":"1962a842-eb66-4ef0-9c63-01c8d73366e5","Type":"ContainerDied","Data":"7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85"}
Apr 20 20:20:24.827152 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.827013 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4" event={"ID":"1962a842-eb66-4ef0-9c63-01c8d73366e5","Type":"ContainerDied","Data":"8f8c8cc4867605b55c2b3c216aedbcfc4623a4299b888824b66b8b4260525374"}
Apr 20 20:20:24.827152 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.827025 2576 scope.go:117] "RemoveContainer" containerID="dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95"
Apr 20 20:20:24.843937 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.843919 2576 scope.go:117] "RemoveContainer" containerID="7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85"
Apr 20 20:20:24.851314 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.851265 2576 scope.go:117] "RemoveContainer" containerID="dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95"
Apr 20 20:20:24.851737 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:20:24.851696 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95\": container with ID starting with dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95 not found: ID does not exist" containerID="dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95"
Apr 20 20:20:24.851887 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.851855 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95"} err="failed to get container status \"dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95\": rpc error: code = NotFound desc = could not find container \"dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95\": container with ID starting with dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95 not found: ID does not exist"
Apr 20 20:20:24.851887 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.851888 2576 scope.go:117] "RemoveContainer" containerID="7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85"
Apr 20 20:20:24.852203 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:20:24.852183 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85\": container with ID starting with 7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85 not found: ID does not exist" containerID="7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85"
Apr 20 20:20:24.852290 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.852218 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85"} err="failed to get container status \"7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85\": rpc error: code = NotFound desc = could not find container \"7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85\": container with ID starting with 7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85 not found: ID does not exist"
Apr 20 20:20:24.852290 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.852241 2576 scope.go:117] "RemoveContainer" containerID="dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95"
Apr 20 20:20:24.852544 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.852516 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95"} err="failed to get container status \"dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95\": rpc error: code = NotFound desc = could not find container \"dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95\": container with ID starting with dffbf78f02d02b613e2e547010933b5c30e0e69347ebc429401b8939e04d2e95 not found: ID does not exist"
Apr 20 20:20:24.852622 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.852545 2576 scope.go:117] "RemoveContainer" containerID="7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85"
Apr 20 20:20:24.852846 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.852818 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85"} err="failed to get container status \"7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85\": rpc error: code = NotFound desc = could not find container \"7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85\": container with ID starting with 7523719ce20844b7d6fce95840037d278514bd5721b9d030e6f904072801fc85 not found: ID does not exist"
Apr 20 20:20:24.853691 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.853671 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"]
Apr 20 20:20:24.856965 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.856946 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4"]
Apr 20 20:20:24.884909 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.884863 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"]
Apr 20 20:20:24.885259 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885245 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="storage-initializer"
Apr 20 20:20:24.885306 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885261 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="storage-initializer"
Apr 20 20:20:24.885306 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885268 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container"
Apr 20 20:20:24.885306 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885274 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container"
Apr 20 20:20:24.885306 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885282 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1962a842-eb66-4ef0-9c63-01c8d73366e5" containerName="kube-rbac-proxy"
Apr 20 20:20:24.885306 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885288 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1962a842-eb66-4ef0-9c63-01c8d73366e5" containerName="kube-rbac-proxy"
Apr 20 20:20:24.885306 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885297 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kube-rbac-proxy"
Apr 20 20:20:24.885306 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885302 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kube-rbac-proxy"
Apr 20 20:20:24.885507 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885313 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1962a842-eb66-4ef0-9c63-01c8d73366e5" containerName="kserve-container"
Apr 20 20:20:24.885507 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885319 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1962a842-eb66-4ef0-9c63-01c8d73366e5" containerName="kserve-container"
Apr 20 20:20:24.885507 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885333 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="agent"
Apr 20 20:20:24.885507 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885338 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="agent"
Apr 20 20:20:24.885507 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885386 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kserve-container"
Apr 20 20:20:24.885507 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885395 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="agent"
Apr 20 20:20:24.885507 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885402 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1962a842-eb66-4ef0-9c63-01c8d73366e5" containerName="kube-rbac-proxy"
Apr 20 20:20:24.885507 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885408 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="975ec177-b69b-4b8b-9297-309ea4fec83a" containerName="kube-rbac-proxy"
Apr 20 20:20:24.885507 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.885416 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1962a842-eb66-4ef0-9c63-01c8d73366e5" containerName="kserve-container"
Apr 20 20:20:24.889708 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.889693 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:24.892264 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.892244 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-kube-rbac-proxy-sar-config\""
Apr 20 20:20:24.892355 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.892296 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-predictor-serving-cert\""
Apr 20 20:20:24.919275 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.919252 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54f66df5-f27f-4cd2-b9e9-8981034b65ff-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-99sfd\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:24.919381 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.919287 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54f66df5-f27f-4cd2-b9e9-8981034b65ff-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-99sfd\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:24.919381 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.919314 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/54f66df5-f27f-4cd2-b9e9-8981034b65ff-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-99sfd\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:24.919381 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.919345 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2957z\" (UniqueName: \"kubernetes.io/projected/54f66df5-f27f-4cd2-b9e9-8981034b65ff-kube-api-access-2957z\") pod \"isvc-lightgbm-predictor-bdf964bd-99sfd\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:24.941783 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.941758 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs"]
Apr 20 20:20:24.942075 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.942040 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kserve-container" containerID="cri-o://d627ea5c10a89fe0750c832b9df94b1a17085566e0b1e964aa535a1ff5513ecf" gracePeriod=30
Apr 20 20:20:24.942181 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.942076 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="agent" containerID="cri-o://ccb5ca53f7095f789d16bd67877acff8382367674630344480f23c1714cd1a07" gracePeriod=30
Apr 20 20:20:24.942181 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.942076 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kube-rbac-proxy" containerID="cri-o://a39aa83ee2f64039af6b6c7a210ff73f4f890d2e6abada19330a977a13b5222c" gracePeriod=30
Apr 20 20:20:24.954880 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:24.954859 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"]
Apr 20 20:20:25.020688 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.020660 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54f66df5-f27f-4cd2-b9e9-8981034b65ff-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-99sfd\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:25.020799 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.020694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54f66df5-f27f-4cd2-b9e9-8981034b65ff-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-99sfd\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:25.020799 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.020735 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/54f66df5-f27f-4cd2-b9e9-8981034b65ff-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-99sfd\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:25.020799 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.020768 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2957z\" (UniqueName: \"kubernetes.io/projected/54f66df5-f27f-4cd2-b9e9-8981034b65ff-kube-api-access-2957z\") pod \"isvc-lightgbm-predictor-bdf964bd-99sfd\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:25.021072 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.021052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54f66df5-f27f-4cd2-b9e9-8981034b65ff-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-99sfd\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:25.021422 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.021396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/54f66df5-f27f-4cd2-b9e9-8981034b65ff-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-99sfd\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:25.022957 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.022940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54f66df5-f27f-4cd2-b9e9-8981034b65ff-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-99sfd\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:25.029223 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.029203 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2957z\" (UniqueName: \"kubernetes.io/projected/54f66df5-f27f-4cd2-b9e9-8981034b65ff-kube-api-access-2957z\") pod \"isvc-lightgbm-predictor-bdf964bd-99sfd\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:25.199747 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.199675 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:25.319181 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.319099 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"]
Apr 20 20:20:25.321469 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:20:25.321442 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54f66df5_f27f_4cd2_b9e9_8981034b65ff.slice/crio-06b112f3ed246b251a117e20cbb34fc66ae95fd430805793e839110baf02ac2c WatchSource:0}: Error finding container 06b112f3ed246b251a117e20cbb34fc66ae95fd430805793e839110baf02ac2c: Status 404 returned error can't find the container with id 06b112f3ed246b251a117e20cbb34fc66ae95fd430805793e839110baf02ac2c
Apr 20 20:20:25.473006 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.472910 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-twhw4" podUID="1962a842-eb66-4ef0-9c63-01c8d73366e5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.39:8643/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Apr 20 20:20:25.831652 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.831622 2576 generic.go:358] "Generic (PLEG): container finished" podID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerID="a39aa83ee2f64039af6b6c7a210ff73f4f890d2e6abada19330a977a13b5222c" exitCode=2
Apr 20 20:20:25.831814 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.831681 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" event={"ID":"99dadaa7-f559-4fe7-9267-042eeea2992f","Type":"ContainerDied","Data":"a39aa83ee2f64039af6b6c7a210ff73f4f890d2e6abada19330a977a13b5222c"}
Apr 20 20:20:25.833009 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.832980 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" event={"ID":"54f66df5-f27f-4cd2-b9e9-8981034b65ff","Type":"ContainerStarted","Data":"6ab177640eb43a7ea6579761554ff1897dee7d1036f3dd5725001c5c72d484c7"}
Apr 20 20:20:25.833154 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.833017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" event={"ID":"54f66df5-f27f-4cd2-b9e9-8981034b65ff","Type":"ContainerStarted","Data":"06b112f3ed246b251a117e20cbb34fc66ae95fd430805793e839110baf02ac2c"}
Apr 20 20:20:25.847078 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.847058 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log"
Apr 20 20:20:25.847534 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.847511 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log"
Apr 20 20:20:25.945085 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:25.945056 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1962a842-eb66-4ef0-9c63-01c8d73366e5" path="/var/lib/kubelet/pods/1962a842-eb66-4ef0-9c63-01c8d73366e5/volumes"
Apr 20 20:20:27.558689 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:27.558650 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused"
Apr 20 20:20:28.852085 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:28.852052 2576 generic.go:358] "Generic (PLEG): container finished" podID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerID="d627ea5c10a89fe0750c832b9df94b1a17085566e0b1e964aa535a1ff5513ecf" exitCode=0
Apr 20 20:20:28.852436 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:28.852132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" event={"ID":"99dadaa7-f559-4fe7-9267-042eeea2992f","Type":"ContainerDied","Data":"d627ea5c10a89fe0750c832b9df94b1a17085566e0b1e964aa535a1ff5513ecf"}
Apr 20 20:20:29.856978 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:29.856946 2576 generic.go:358] "Generic (PLEG): container finished" podID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerID="6ab177640eb43a7ea6579761554ff1897dee7d1036f3dd5725001c5c72d484c7" exitCode=0
Apr 20 20:20:29.857346 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:29.857023 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" event={"ID":"54f66df5-f27f-4cd2-b9e9-8981034b65ff","Type":"ContainerDied","Data":"6ab177640eb43a7ea6579761554ff1897dee7d1036f3dd5725001c5c72d484c7"}
Apr 20 20:20:32.558959 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:32.558922 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused"
Apr 20 20:20:32.563330 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:32.563299 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 20 20:20:32.563624 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:32.563600 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:20:37.558553 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:37.558510 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused"
Apr 20 20:20:37.560933 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:37.558654 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs"
Apr 20 20:20:37.888859 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:37.888784 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" event={"ID":"54f66df5-f27f-4cd2-b9e9-8981034b65ff","Type":"ContainerStarted","Data":"8d136ccde902756eb905af502a368f635b75cc0af1987d7f5ca1c48788d96bc1"}
Apr 20 20:20:37.888859 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:37.888822 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" event={"ID":"54f66df5-f27f-4cd2-b9e9-8981034b65ff","Type":"ContainerStarted","Data":"d303596dd509e77679bcdcb447b240f145af0ac689e6f15d38d50a0bf110d462"}
Apr 20 20:20:37.889048 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:37.889004 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:37.907743 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:37.907695 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" podStartSLOduration=6.927940217 podStartE2EDuration="13.907683887s" podCreationTimestamp="2026-04-20 20:20:24 +0000 UTC" firstStartedPulling="2026-04-20 20:20:29.858450624 +0000 UTC m=+904.571132598" lastFinishedPulling="2026-04-20 20:20:36.838194289 +0000 UTC m=+911.550876268" observedRunningTime="2026-04-20 20:20:37.906605216 +0000 UTC m=+912.619287212" watchObservedRunningTime="2026-04-20 20:20:37.907683887 +0000 UTC m=+912.620365886"
Apr 20 20:20:38.891883 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:38.891854 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:38.892996 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:38.892971 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 20 20:20:39.897261 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:39.897214 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 20 20:20:42.559209 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:42.559170 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused"
Apr 20 20:20:42.563450 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:42.563409 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 20 20:20:42.563729 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:42.563707 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:20:44.901149 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:44.901123 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:20:44.901738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:44.901710 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 20 20:20:47.558978 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:47.558936 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused"
Apr 20 20:20:52.559181 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:52.559137 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused"
Apr 20 20:20:52.563444 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:52.563412 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 20 20:20:52.563596 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:52.563579 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs"
Apr 20 20:20:52.563812 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:52.563786 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:20:52.563893 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:52.563882 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs"
Apr 20 20:20:54.901614 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:54.901574 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 20 20:20:55.119387 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.119364 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs"
Apr 20 20:20:55.285532 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.285507 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99dadaa7-f559-4fe7-9267-042eeea2992f-kserve-provision-location\") pod \"99dadaa7-f559-4fe7-9267-042eeea2992f\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") "
Apr 20 20:20:55.285674 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.285547 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99dadaa7-f559-4fe7-9267-042eeea2992f-isvc-logger-kube-rbac-proxy-sar-config\") pod \"99dadaa7-f559-4fe7-9267-042eeea2992f\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") "
Apr 20 20:20:55.285674 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.285580 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t5k8\" (UniqueName: \"kubernetes.io/projected/99dadaa7-f559-4fe7-9267-042eeea2992f-kube-api-access-8t5k8\") pod \"99dadaa7-f559-4fe7-9267-042eeea2992f\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") "
Apr 20 20:20:55.285674 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.285639 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99dadaa7-f559-4fe7-9267-042eeea2992f-proxy-tls\") pod \"99dadaa7-f559-4fe7-9267-042eeea2992f\" (UID: \"99dadaa7-f559-4fe7-9267-042eeea2992f\") "
Apr 20 20:20:55.285862 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.285837 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/99dadaa7-f559-4fe7-9267-042eeea2992f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "99dadaa7-f559-4fe7-9267-042eeea2992f" (UID: "99dadaa7-f559-4fe7-9267-042eeea2992f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:20:55.285922 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.285899 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99dadaa7-f559-4fe7-9267-042eeea2992f-isvc-logger-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-kube-rbac-proxy-sar-config") pod "99dadaa7-f559-4fe7-9267-042eeea2992f" (UID: "99dadaa7-f559-4fe7-9267-042eeea2992f"). InnerVolumeSpecName "isvc-logger-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:20:55.287617 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.287594 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99dadaa7-f559-4fe7-9267-042eeea2992f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "99dadaa7-f559-4fe7-9267-042eeea2992f" (UID: "99dadaa7-f559-4fe7-9267-042eeea2992f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:20:55.287711 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.287646 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99dadaa7-f559-4fe7-9267-042eeea2992f-kube-api-access-8t5k8" (OuterVolumeSpecName: "kube-api-access-8t5k8") pod "99dadaa7-f559-4fe7-9267-042eeea2992f" (UID: "99dadaa7-f559-4fe7-9267-042eeea2992f"). InnerVolumeSpecName "kube-api-access-8t5k8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:20:55.387094 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.387067 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/99dadaa7-f559-4fe7-9267-042eeea2992f-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:20:55.387094 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.387090 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/99dadaa7-f559-4fe7-9267-042eeea2992f-isvc-logger-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:20:55.387299 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.387101 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8t5k8\" (UniqueName: \"kubernetes.io/projected/99dadaa7-f559-4fe7-9267-042eeea2992f-kube-api-access-8t5k8\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:20:55.387299 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.387128 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99dadaa7-f559-4fe7-9267-042eeea2992f-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:20:55.950091 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.950055 2576 generic.go:358] "Generic (PLEG): container finished" podID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerID="ccb5ca53f7095f789d16bd67877acff8382367674630344480f23c1714cd1a07" exitCode=0
Apr 20 20:20:55.950574 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.950104 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" event={"ID":"99dadaa7-f559-4fe7-9267-042eeea2992f","Type":"ContainerDied","Data":"ccb5ca53f7095f789d16bd67877acff8382367674630344480f23c1714cd1a07"}
Apr 20 20:20:55.950574 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.950146 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs" event={"ID":"99dadaa7-f559-4fe7-9267-042eeea2992f","Type":"ContainerDied","Data":"a288264925e52c26341e8adfc20e8348bb904242f620071bbb213d1af5e6154d"}
Apr 20 20:20:55.950574 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.950162 2576 scope.go:117] "RemoveContainer" containerID="ccb5ca53f7095f789d16bd67877acff8382367674630344480f23c1714cd1a07"
Apr 20 20:20:55.950574 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.950172 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs"
Apr 20 20:20:55.958572 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.958551 2576 scope.go:117] "RemoveContainer" containerID="a39aa83ee2f64039af6b6c7a210ff73f4f890d2e6abada19330a977a13b5222c"
Apr 20 20:20:55.965887 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.965872 2576 scope.go:117] "RemoveContainer" containerID="d627ea5c10a89fe0750c832b9df94b1a17085566e0b1e964aa535a1ff5513ecf"
Apr 20 20:20:55.970294 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.970254 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs"]
Apr 20 20:20:55.973640 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.973604 2576 scope.go:117] "RemoveContainer" containerID="5d4365d0eabd6278c7c1eec14da24868f62a63fa62512b943f39450484924a7c"
Apr 20 20:20:55.975257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.975240 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-r98gs"]
Apr 20 20:20:55.980604 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.980589 2576 scope.go:117] "RemoveContainer" containerID="ccb5ca53f7095f789d16bd67877acff8382367674630344480f23c1714cd1a07"
Apr 20 20:20:55.980852 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:20:55.980835 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccb5ca53f7095f789d16bd67877acff8382367674630344480f23c1714cd1a07\": container with ID starting with ccb5ca53f7095f789d16bd67877acff8382367674630344480f23c1714cd1a07 not found: ID does not exist" containerID="ccb5ca53f7095f789d16bd67877acff8382367674630344480f23c1714cd1a07"
Apr 20 20:20:55.980904 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.980860 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccb5ca53f7095f789d16bd67877acff8382367674630344480f23c1714cd1a07"} err="failed to get container status \"ccb5ca53f7095f789d16bd67877acff8382367674630344480f23c1714cd1a07\": rpc error: code = NotFound desc = could not find container \"ccb5ca53f7095f789d16bd67877acff8382367674630344480f23c1714cd1a07\": container with ID starting with ccb5ca53f7095f789d16bd67877acff8382367674630344480f23c1714cd1a07 not found: ID does not exist"
Apr 20 20:20:55.980904 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.980876 2576 scope.go:117] "RemoveContainer" containerID="a39aa83ee2f64039af6b6c7a210ff73f4f890d2e6abada19330a977a13b5222c"
Apr 20 20:20:55.981095 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:20:55.981080 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a39aa83ee2f64039af6b6c7a210ff73f4f890d2e6abada19330a977a13b5222c\": container with ID starting with a39aa83ee2f64039af6b6c7a210ff73f4f890d2e6abada19330a977a13b5222c not found: ID does not exist" containerID="a39aa83ee2f64039af6b6c7a210ff73f4f890d2e6abada19330a977a13b5222c"
Apr 20 20:20:55.981179 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.981098 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39aa83ee2f64039af6b6c7a210ff73f4f890d2e6abada19330a977a13b5222c"} err="failed to get container status \"a39aa83ee2f64039af6b6c7a210ff73f4f890d2e6abada19330a977a13b5222c\": rpc error: code = NotFound desc = could not find container \"a39aa83ee2f64039af6b6c7a210ff73f4f890d2e6abada19330a977a13b5222c\": container with ID starting with a39aa83ee2f64039af6b6c7a210ff73f4f890d2e6abada19330a977a13b5222c not found: ID does not exist"
Apr 20 20:20:55.981179 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.981123 2576 scope.go:117] "RemoveContainer" containerID="d627ea5c10a89fe0750c832b9df94b1a17085566e0b1e964aa535a1ff5513ecf"
Apr 20 20:20:55.981308 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:20:55.981293 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d627ea5c10a89fe0750c832b9df94b1a17085566e0b1e964aa535a1ff5513ecf\": container with ID starting with d627ea5c10a89fe0750c832b9df94b1a17085566e0b1e964aa535a1ff5513ecf not found: ID does not exist" containerID="d627ea5c10a89fe0750c832b9df94b1a17085566e0b1e964aa535a1ff5513ecf"
Apr 20 20:20:55.981343 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.981310 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d627ea5c10a89fe0750c832b9df94b1a17085566e0b1e964aa535a1ff5513ecf"} err="failed to get container status \"d627ea5c10a89fe0750c832b9df94b1a17085566e0b1e964aa535a1ff5513ecf\": rpc error: code = NotFound desc = could not find container \"d627ea5c10a89fe0750c832b9df94b1a17085566e0b1e964aa535a1ff5513ecf\": container with ID starting with d627ea5c10a89fe0750c832b9df94b1a17085566e0b1e964aa535a1ff5513ecf not found: ID does not exist"
Apr 20 20:20:55.981343 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.981322 2576 scope.go:117] "RemoveContainer" containerID="5d4365d0eabd6278c7c1eec14da24868f62a63fa62512b943f39450484924a7c"
Apr 20 20:20:55.981504 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:20:55.981489 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d4365d0eabd6278c7c1eec14da24868f62a63fa62512b943f39450484924a7c\": container with ID starting with 5d4365d0eabd6278c7c1eec14da24868f62a63fa62512b943f39450484924a7c not found: ID does not exist" containerID="5d4365d0eabd6278c7c1eec14da24868f62a63fa62512b943f39450484924a7c"
Apr 20 20:20:55.981545 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:55.981507 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4365d0eabd6278c7c1eec14da24868f62a63fa62512b943f39450484924a7c"} err="failed to get container status \"5d4365d0eabd6278c7c1eec14da24868f62a63fa62512b943f39450484924a7c\": rpc error: code = NotFound desc = could not find container \"5d4365d0eabd6278c7c1eec14da24868f62a63fa62512b943f39450484924a7c\": container with ID starting with 5d4365d0eabd6278c7c1eec14da24868f62a63fa62512b943f39450484924a7c not found: ID does not exist"
Apr 20 20:20:57.944920 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:20:57.944874 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" path="/var/lib/kubelet/pods/99dadaa7-f559-4fe7-9267-042eeea2992f/volumes"
Apr 20 20:21:04.901723 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:21:04.901684 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 20 20:21:14.901728 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:21:14.901693 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 20 20:21:24.901837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:21:24.901796 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 20 20:21:34.902414 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:21:34.902373 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 20 20:21:44.901681 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:21:44.901600 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 20 20:21:54.902840 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:21:54.902810 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:22:04.437290 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.437251 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"]
Apr 20 20:22:04.437729 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.437585 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kserve-container" containerID="cri-o://d303596dd509e77679bcdcb447b240f145af0ac689e6f15d38d50a0bf110d462" gracePeriod=30
Apr 20 20:22:04.437729 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.437664 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kube-rbac-proxy" containerID="cri-o://8d136ccde902756eb905af502a368f635b75cc0af1987d7f5ca1c48788d96bc1" gracePeriod=30
Apr 20 20:22:04.560013 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.559986 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"]
Apr 20 20:22:04.560378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.560364 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="agent"
Apr 20 20:22:04.560442 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.560380 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="agent"
Apr 20 20:22:04.560442 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.560394 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kserve-container"
Apr 20 20:22:04.560442 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.560402 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kserve-container"
Apr 20 20:22:04.560442 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.560418 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="storage-initializer"
Apr 20 20:22:04.560442 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.560425 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="storage-initializer"
Apr 20 20:22:04.560442 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.560433 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kube-rbac-proxy"
Apr 20 20:22:04.560442 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.560438 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kube-rbac-proxy"
Apr 20 20:22:04.560651 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.560516 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kserve-container"
Apr 20 20:22:04.560651 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.560526 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="kube-rbac-proxy"
Apr 20 20:22:04.560651 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.560533 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="99dadaa7-f559-4fe7-9267-042eeea2992f" containerName="agent"
Apr 20 20:22:04.563915 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.563899 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"
Apr 20 20:22:04.566690 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.566662 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-predictor-serving-cert\""
Apr 20 20:22:04.566859 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.566669 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\""
Apr 20 20:22:04.574446 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.574425 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"]
Apr 20 20:22:04.699711 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.699628 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80b222d6-0d0d-4b9b-baa8-9a04be505252-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"
Apr 20 20:22:04.699711 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.699686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/80b222d6-0d0d-4b9b-baa8-9a04be505252-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"
Apr 20 20:22:04.699910 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.699749 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvqkd\" (UniqueName: \"kubernetes.io/projected/80b222d6-0d0d-4b9b-baa8-9a04be505252-kube-api-access-cvqkd\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"
Apr 20 20:22:04.699910 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.699812 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/80b222d6-0d0d-4b9b-baa8-9a04be505252-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"
Apr 20 20:22:04.801225 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.801187 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/80b222d6-0d0d-4b9b-baa8-9a04be505252-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"
Apr 20 20:22:04.801404 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.801273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvqkd\" (UniqueName: \"kubernetes.io/projected/80b222d6-0d0d-4b9b-baa8-9a04be505252-kube-api-access-cvqkd\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"
Apr 20 20:22:04.801404 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.801308 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/80b222d6-0d0d-4b9b-baa8-9a04be505252-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"
Apr 20 20:22:04.801553 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.801514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80b222d6-0d0d-4b9b-baa8-9a04be505252-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"
Apr 20 20:22:04.801616 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.801574 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/80b222d6-0d0d-4b9b-baa8-9a04be505252-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"
Apr 20 20:22:04.801960 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.801935 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/80b222d6-0d0d-4b9b-baa8-9a04be505252-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"
Apr 20 20:22:04.803843 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.803820 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80b222d6-0d0d-4b9b-baa8-9a04be505252-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"
Apr 20 20:22:04.809267 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.809249 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvqkd\" (UniqueName: \"kubernetes.io/projected/80b222d6-0d0d-4b9b-baa8-9a04be505252-kube-api-access-cvqkd\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"
Apr 20 20:22:04.874287 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.874263 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"
Apr 20 20:22:04.898211 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.898179 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.41:8643/healthz\": dial tcp 10.132.0.41:8643: connect: connection refused"
Apr 20 20:22:04.902452 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.902426 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 20 20:22:04.995121 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:04.995087 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"]
Apr 20 20:22:04.996998 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:22:04.996964 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80b222d6_0d0d_4b9b_baa8_9a04be505252.slice/crio-c76d7959bcc4849bbdbd6a138ab4574f18cc724b4477c3ab2a7376962aaed470 WatchSource:0}: Error finding container c76d7959bcc4849bbdbd6a138ab4574f18cc724b4477c3ab2a7376962aaed470: Status 404 returned error can't find the container with id c76d7959bcc4849bbdbd6a138ab4574f18cc724b4477c3ab2a7376962aaed470
Apr 20 20:22:05.167872 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:05.167839 2576 generic.go:358] "Generic (PLEG): container finished" podID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerID="8d136ccde902756eb905af502a368f635b75cc0af1987d7f5ca1c48788d96bc1" exitCode=2
Apr 20 20:22:05.168042 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:05.167912 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" event={"ID":"54f66df5-f27f-4cd2-b9e9-8981034b65ff","Type":"ContainerDied","Data":"8d136ccde902756eb905af502a368f635b75cc0af1987d7f5ca1c48788d96bc1"}
Apr 20 20:22:05.169168 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:05.169147 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" event={"ID":"80b222d6-0d0d-4b9b-baa8-9a04be505252","Type":"ContainerStarted","Data":"edf506c5408cb0bef5cdc4293fb7ac1dcac1cc46986d352528a83e12f2470710"}
Apr 20 20:22:05.169271 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:05.169174 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" event={"ID":"80b222d6-0d0d-4b9b-baa8-9a04be505252","Type":"ContainerStarted","Data":"c76d7959bcc4849bbdbd6a138ab4574f18cc724b4477c3ab2a7376962aaed470"}
Apr 20 20:22:08.674434 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:08.674405 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:22:08.734763 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:08.734698 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54f66df5-f27f-4cd2-b9e9-8981034b65ff-kserve-provision-location\") pod \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") "
Apr 20 20:22:08.734763 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:08.734746 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54f66df5-f27f-4cd2-b9e9-8981034b65ff-proxy-tls\") pod \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") "
Apr 20 20:22:08.734930 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:08.734773 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2957z\" (UniqueName: \"kubernetes.io/projected/54f66df5-f27f-4cd2-b9e9-8981034b65ff-kube-api-access-2957z\") pod \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") "
Apr 20 20:22:08.734930 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:08.734794 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/54f66df5-f27f-4cd2-b9e9-8981034b65ff-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\" (UID: \"54f66df5-f27f-4cd2-b9e9-8981034b65ff\") "
Apr 20 20:22:08.735041 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:08.734992 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54f66df5-f27f-4cd2-b9e9-8981034b65ff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "54f66df5-f27f-4cd2-b9e9-8981034b65ff" (UID: "54f66df5-f27f-4cd2-b9e9-8981034b65ff"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:22:08.735219 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:08.735197 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54f66df5-f27f-4cd2-b9e9-8981034b65ff-isvc-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-kube-rbac-proxy-sar-config") pod "54f66df5-f27f-4cd2-b9e9-8981034b65ff" (UID: "54f66df5-f27f-4cd2-b9e9-8981034b65ff"). InnerVolumeSpecName "isvc-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:22:08.736730 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:08.736710 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54f66df5-f27f-4cd2-b9e9-8981034b65ff-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "54f66df5-f27f-4cd2-b9e9-8981034b65ff" (UID: "54f66df5-f27f-4cd2-b9e9-8981034b65ff"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:22:08.736779 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:08.736753 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54f66df5-f27f-4cd2-b9e9-8981034b65ff-kube-api-access-2957z" (OuterVolumeSpecName: "kube-api-access-2957z") pod "54f66df5-f27f-4cd2-b9e9-8981034b65ff" (UID: "54f66df5-f27f-4cd2-b9e9-8981034b65ff"). InnerVolumeSpecName "kube-api-access-2957z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:22:08.835360 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:08.835336 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2957z\" (UniqueName: \"kubernetes.io/projected/54f66df5-f27f-4cd2-b9e9-8981034b65ff-kube-api-access-2957z\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:22:08.835360 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:08.835358 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/54f66df5-f27f-4cd2-b9e9-8981034b65ff-isvc-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:22:08.835515 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:08.835368 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54f66df5-f27f-4cd2-b9e9-8981034b65ff-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:22:08.835515 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:08.835378 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54f66df5-f27f-4cd2-b9e9-8981034b65ff-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:22:09.183509 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.183473 2576 generic.go:358] "Generic (PLEG): container finished" podID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerID="d303596dd509e77679bcdcb447b240f145af0ac689e6f15d38d50a0bf110d462" exitCode=0
Apr 20 20:22:09.183681 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.183558 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"
Apr 20 20:22:09.183681 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.183558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" event={"ID":"54f66df5-f27f-4cd2-b9e9-8981034b65ff","Type":"ContainerDied","Data":"d303596dd509e77679bcdcb447b240f145af0ac689e6f15d38d50a0bf110d462"}
Apr 20 20:22:09.183681 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.183668 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd" event={"ID":"54f66df5-f27f-4cd2-b9e9-8981034b65ff","Type":"ContainerDied","Data":"06b112f3ed246b251a117e20cbb34fc66ae95fd430805793e839110baf02ac2c"}
Apr 20 20:22:09.183828 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.183688 2576 scope.go:117] "RemoveContainer" containerID="8d136ccde902756eb905af502a368f635b75cc0af1987d7f5ca1c48788d96bc1"
Apr 20 20:22:09.185086 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.185062 2576 generic.go:358] "Generic (PLEG): container finished" podID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerID="edf506c5408cb0bef5cdc4293fb7ac1dcac1cc46986d352528a83e12f2470710" exitCode=0
Apr 20 20:22:09.185212 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.185104 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" event={"ID":"80b222d6-0d0d-4b9b-baa8-9a04be505252","Type":"ContainerDied","Data":"edf506c5408cb0bef5cdc4293fb7ac1dcac1cc46986d352528a83e12f2470710"}
Apr 20 20:22:09.193244 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.193209 2576 scope.go:117] "RemoveContainer" containerID="d303596dd509e77679bcdcb447b240f145af0ac689e6f15d38d50a0bf110d462"
Apr 20 20:22:09.201652 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.201633 2576 scope.go:117] "RemoveContainer" containerID="6ab177640eb43a7ea6579761554ff1897dee7d1036f3dd5725001c5c72d484c7"
Apr 20 20:22:09.210104 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.210091 2576 scope.go:117] "RemoveContainer" containerID="8d136ccde902756eb905af502a368f635b75cc0af1987d7f5ca1c48788d96bc1"
Apr 20 20:22:09.210346 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:22:09.210326 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d136ccde902756eb905af502a368f635b75cc0af1987d7f5ca1c48788d96bc1\": container with ID starting with 8d136ccde902756eb905af502a368f635b75cc0af1987d7f5ca1c48788d96bc1 not found: ID does not exist" containerID="8d136ccde902756eb905af502a368f635b75cc0af1987d7f5ca1c48788d96bc1"
Apr 20 20:22:09.210405 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.210352 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d136ccde902756eb905af502a368f635b75cc0af1987d7f5ca1c48788d96bc1"} err="failed to get container status \"8d136ccde902756eb905af502a368f635b75cc0af1987d7f5ca1c48788d96bc1\": rpc error: code = NotFound desc = could not find container \"8d136ccde902756eb905af502a368f635b75cc0af1987d7f5ca1c48788d96bc1\": container with ID starting with 8d136ccde902756eb905af502a368f635b75cc0af1987d7f5ca1c48788d96bc1 not found: ID does not exist"
Apr 20 20:22:09.210405 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.210370 2576 scope.go:117] "RemoveContainer" containerID="d303596dd509e77679bcdcb447b240f145af0ac689e6f15d38d50a0bf110d462"
Apr 20 20:22:09.210598 ip-10-0-143-23
kubenswrapper[2576]: E0420 20:22:09.210572 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d303596dd509e77679bcdcb447b240f145af0ac689e6f15d38d50a0bf110d462\": container with ID starting with d303596dd509e77679bcdcb447b240f145af0ac689e6f15d38d50a0bf110d462 not found: ID does not exist" containerID="d303596dd509e77679bcdcb447b240f145af0ac689e6f15d38d50a0bf110d462" Apr 20 20:22:09.210645 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.210605 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d303596dd509e77679bcdcb447b240f145af0ac689e6f15d38d50a0bf110d462"} err="failed to get container status \"d303596dd509e77679bcdcb447b240f145af0ac689e6f15d38d50a0bf110d462\": rpc error: code = NotFound desc = could not find container \"d303596dd509e77679bcdcb447b240f145af0ac689e6f15d38d50a0bf110d462\": container with ID starting with d303596dd509e77679bcdcb447b240f145af0ac689e6f15d38d50a0bf110d462 not found: ID does not exist" Apr 20 20:22:09.210645 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.210621 2576 scope.go:117] "RemoveContainer" containerID="6ab177640eb43a7ea6579761554ff1897dee7d1036f3dd5725001c5c72d484c7" Apr 20 20:22:09.210839 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:22:09.210822 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ab177640eb43a7ea6579761554ff1897dee7d1036f3dd5725001c5c72d484c7\": container with ID starting with 6ab177640eb43a7ea6579761554ff1897dee7d1036f3dd5725001c5c72d484c7 not found: ID does not exist" containerID="6ab177640eb43a7ea6579761554ff1897dee7d1036f3dd5725001c5c72d484c7" Apr 20 20:22:09.210892 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.210850 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ab177640eb43a7ea6579761554ff1897dee7d1036f3dd5725001c5c72d484c7"} err="failed to get container status \"6ab177640eb43a7ea6579761554ff1897dee7d1036f3dd5725001c5c72d484c7\": rpc error: code = NotFound desc = could not find container \"6ab177640eb43a7ea6579761554ff1897dee7d1036f3dd5725001c5c72d484c7\": container with ID starting with 6ab177640eb43a7ea6579761554ff1897dee7d1036f3dd5725001c5c72d484c7 not found: ID does not exist" Apr 20 20:22:09.218890 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.218870 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"] Apr 20 20:22:09.224830 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.224797 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-99sfd"] Apr 20 20:22:09.949721 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:09.949689 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" path="/var/lib/kubelet/pods/54f66df5-f27f-4cd2-b9e9-8981034b65ff/volumes" Apr 20 20:22:10.189948 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:10.189916 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" event={"ID":"80b222d6-0d0d-4b9b-baa8-9a04be505252","Type":"ContainerStarted","Data":"cd4d7aab42e658e15c74111f9fb6e0663cbdad5a1478647ccc3ad14723ae73e6"} Apr 20 20:22:10.190103 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:10.189958 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" event={"ID":"80b222d6-0d0d-4b9b-baa8-9a04be505252","Type":"ContainerStarted","Data":"86bb97a7e27b55832b59d2c770ad647e2412b4e530791e0a2a01a35474c5a7bf"} Apr 20 20:22:10.190331 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:10.190312 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" Apr 20 20:22:10.208791 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:10.208715 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" podStartSLOduration=6.208703513 podStartE2EDuration="6.208703513s" podCreationTimestamp="2026-04-20 20:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:22:10.207305473 +0000 UTC m=+1004.919987470" watchObservedRunningTime="2026-04-20 20:22:10.208703513 +0000 UTC m=+1004.921385509" Apr 20 20:22:11.194075 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:11.194043 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" Apr 20 20:22:11.195229 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:11.195200 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 20 20:22:12.197240 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:12.197204 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 20 20:22:17.201139 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:17.201099 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" Apr 20 20:22:17.201651 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:17.201628 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 20 20:22:27.201776 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:27.201737 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 20 20:22:37.201710 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:37.201670 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 20 20:22:47.202061 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:47.202018 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 20 20:22:57.202129 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:22:57.202070 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 20 20:23:07.201682 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:07.201592 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 20 20:23:17.202071 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:17.202033 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 20 20:23:27.202916 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:27.202887 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" Apr 20 20:23:35.149679 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.149644 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"] Apr 20 20:23:35.150197 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.150016 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kserve-container" containerID="cri-o://86bb97a7e27b55832b59d2c770ad647e2412b4e530791e0a2a01a35474c5a7bf" gracePeriod=30 Apr 20 20:23:35.150197 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.150055 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kube-rbac-proxy" containerID="cri-o://cd4d7aab42e658e15c74111f9fb6e0663cbdad5a1478647ccc3ad14723ae73e6" gracePeriod=30 Apr 20 20:23:35.471848 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.471779 2576 generic.go:358] "Generic (PLEG): container finished" podID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerID="cd4d7aab42e658e15c74111f9fb6e0663cbdad5a1478647ccc3ad14723ae73e6" exitCode=2 Apr 20 20:23:35.471981 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.471853 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" event={"ID":"80b222d6-0d0d-4b9b-baa8-9a04be505252","Type":"ContainerDied","Data":"cd4d7aab42e658e15c74111f9fb6e0663cbdad5a1478647ccc3ad14723ae73e6"} Apr 20 20:23:35.751319 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.751243 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn"] Apr 20 20:23:35.751606 ip-10-0-143-23 kubenswrapper[2576]: 
I0420 20:23:35.751594 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="storage-initializer" Apr 20 20:23:35.751653 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.751608 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="storage-initializer" Apr 20 20:23:35.751653 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.751626 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kube-rbac-proxy" Apr 20 20:23:35.751653 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.751632 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kube-rbac-proxy" Apr 20 20:23:35.751653 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.751644 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kserve-container" Apr 20 20:23:35.751653 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.751650 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kserve-container" Apr 20 20:23:35.751803 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.751715 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kube-rbac-proxy" Apr 20 20:23:35.751803 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.751724 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="54f66df5-f27f-4cd2-b9e9-8981034b65ff" containerName="kserve-container" Apr 20 20:23:35.754830 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.754814 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:23:35.757422 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.757400 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-predictor-serving-cert\"" Apr 20 20:23:35.757528 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.757403 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 20 20:23:35.818795 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.818768 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn"] Apr 20 20:23:35.911230 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.911202 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/516a7d76-1274-4e97-9902-e7e3318f799f-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:23:35.911362 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.911239 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/516a7d76-1274-4e97-9902-e7e3318f799f-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:23:35.911362 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.911259 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/516a7d76-1274-4e97-9902-e7e3318f799f-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:23:35.911437 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:35.911382 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzqd7\" (UniqueName: \"kubernetes.io/projected/516a7d76-1274-4e97-9902-e7e3318f799f-kube-api-access-jzqd7\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:23:36.012504 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:36.012428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jzqd7\" (UniqueName: \"kubernetes.io/projected/516a7d76-1274-4e97-9902-e7e3318f799f-kube-api-access-jzqd7\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:23:36.012504 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:36.012488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/516a7d76-1274-4e97-9902-e7e3318f799f-proxy-tls\") pod 
\"isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:23:36.012667 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:36.012512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/516a7d76-1274-4e97-9902-e7e3318f799f-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:23:36.012667 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:36.012530 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/516a7d76-1274-4e97-9902-e7e3318f799f-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:23:36.013000 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:36.012982 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/516a7d76-1274-4e97-9902-e7e3318f799f-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:23:36.013238 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:36.013221 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/516a7d76-1274-4e97-9902-e7e3318f799f-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:23:36.014850 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:36.014833 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/516a7d76-1274-4e97-9902-e7e3318f799f-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:23:36.022884 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:36.022864 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzqd7\" (UniqueName: \"kubernetes.io/projected/516a7d76-1274-4e97-9902-e7e3318f799f-kube-api-access-jzqd7\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:23:36.065238 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:36.065216 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:23:36.185243 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:36.185216 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn"] Apr 20 20:23:36.186697 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:23:36.186665 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod516a7d76_1274_4e97_9902_e7e3318f799f.slice/crio-bee95ed6469cc1f9d5effd30f3aca388eb4137fc186be22823c248a4af9417ef WatchSource:0}: Error finding container bee95ed6469cc1f9d5effd30f3aca388eb4137fc186be22823c248a4af9417ef: Status 404 returned error can't find the container with id bee95ed6469cc1f9d5effd30f3aca388eb4137fc186be22823c248a4af9417ef Apr 20 20:23:36.188609 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:36.188592 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:23:36.476489 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:36.476455 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" event={"ID":"516a7d76-1274-4e97-9902-e7e3318f799f","Type":"ContainerStarted","Data":"7fa3a03f28e24a431eb871a9507709288caeb5c4bb5a4cf9b5cdd49d1339b3f7"} Apr 20 20:23:36.476489 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:36.476490 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" event={"ID":"516a7d76-1274-4e97-9902-e7e3318f799f","Type":"ContainerStarted","Data":"bee95ed6469cc1f9d5effd30f3aca388eb4137fc186be22823c248a4af9417ef"} Apr 20 20:23:37.198455 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:37.198415 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused" Apr 20 20:23:37.201685 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:37.201663 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 20 20:23:39.292095 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.292069 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" Apr 20 20:23:39.441090 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.441008 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80b222d6-0d0d-4b9b-baa8-9a04be505252-proxy-tls\") pod \"80b222d6-0d0d-4b9b-baa8-9a04be505252\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " Apr 20 20:23:39.441090 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.441063 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/80b222d6-0d0d-4b9b-baa8-9a04be505252-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"80b222d6-0d0d-4b9b-baa8-9a04be505252\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " Apr 20 20:23:39.441311 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.441094 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvqkd\" (UniqueName: \"kubernetes.io/projected/80b222d6-0d0d-4b9b-baa8-9a04be505252-kube-api-access-cvqkd\") pod \"80b222d6-0d0d-4b9b-baa8-9a04be505252\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " Apr 20 20:23:39.441311 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.441256 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/80b222d6-0d0d-4b9b-baa8-9a04be505252-kserve-provision-location\") pod \"80b222d6-0d0d-4b9b-baa8-9a04be505252\" (UID: \"80b222d6-0d0d-4b9b-baa8-9a04be505252\") " Apr 20 20:23:39.441555 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.441524 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80b222d6-0d0d-4b9b-baa8-9a04be505252-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config") pod "80b222d6-0d0d-4b9b-baa8-9a04be505252" (UID: "80b222d6-0d0d-4b9b-baa8-9a04be505252"). InnerVolumeSpecName "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:23:39.441749 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.441566 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80b222d6-0d0d-4b9b-baa8-9a04be505252-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "80b222d6-0d0d-4b9b-baa8-9a04be505252" (UID: "80b222d6-0d0d-4b9b-baa8-9a04be505252"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:23:39.443193 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.443171 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80b222d6-0d0d-4b9b-baa8-9a04be505252-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "80b222d6-0d0d-4b9b-baa8-9a04be505252" (UID: "80b222d6-0d0d-4b9b-baa8-9a04be505252"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:23:39.443312 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.443205 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80b222d6-0d0d-4b9b-baa8-9a04be505252-kube-api-access-cvqkd" (OuterVolumeSpecName: "kube-api-access-cvqkd") pod "80b222d6-0d0d-4b9b-baa8-9a04be505252" (UID: "80b222d6-0d0d-4b9b-baa8-9a04be505252"). 
InnerVolumeSpecName "kube-api-access-cvqkd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:23:39.486891 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.486862 2576 generic.go:358] "Generic (PLEG): container finished" podID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerID="86bb97a7e27b55832b59d2c770ad647e2412b4e530791e0a2a01a35474c5a7bf" exitCode=0 Apr 20 20:23:39.487016 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.486939 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" Apr 20 20:23:39.487016 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.486947 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" event={"ID":"80b222d6-0d0d-4b9b-baa8-9a04be505252","Type":"ContainerDied","Data":"86bb97a7e27b55832b59d2c770ad647e2412b4e530791e0a2a01a35474c5a7bf"} Apr 20 20:23:39.487016 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.486990 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc" event={"ID":"80b222d6-0d0d-4b9b-baa8-9a04be505252","Type":"ContainerDied","Data":"c76d7959bcc4849bbdbd6a138ab4574f18cc724b4477c3ab2a7376962aaed470"} Apr 20 20:23:39.487016 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.487012 2576 scope.go:117] "RemoveContainer" containerID="cd4d7aab42e658e15c74111f9fb6e0663cbdad5a1478647ccc3ad14723ae73e6" Apr 20 20:23:39.498083 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.498055 2576 scope.go:117] "RemoveContainer" containerID="86bb97a7e27b55832b59d2c770ad647e2412b4e530791e0a2a01a35474c5a7bf" Apr 20 20:23:39.505305 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.505290 2576 scope.go:117] "RemoveContainer" containerID="edf506c5408cb0bef5cdc4293fb7ac1dcac1cc46986d352528a83e12f2470710" Apr 20 20:23:39.512288 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.512244 2576 scope.go:117] "RemoveContainer" containerID="cd4d7aab42e658e15c74111f9fb6e0663cbdad5a1478647ccc3ad14723ae73e6" Apr 20 20:23:39.512743 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:23:39.512715 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd4d7aab42e658e15c74111f9fb6e0663cbdad5a1478647ccc3ad14723ae73e6\": container with ID starting with cd4d7aab42e658e15c74111f9fb6e0663cbdad5a1478647ccc3ad14723ae73e6 not found: ID does not exist" containerID="cd4d7aab42e658e15c74111f9fb6e0663cbdad5a1478647ccc3ad14723ae73e6" Apr 20 20:23:39.512821 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.512749 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd4d7aab42e658e15c74111f9fb6e0663cbdad5a1478647ccc3ad14723ae73e6"} err="failed to get container status \"cd4d7aab42e658e15c74111f9fb6e0663cbdad5a1478647ccc3ad14723ae73e6\": rpc error: code = NotFound desc = could not find container \"cd4d7aab42e658e15c74111f9fb6e0663cbdad5a1478647ccc3ad14723ae73e6\": container with ID starting with cd4d7aab42e658e15c74111f9fb6e0663cbdad5a1478647ccc3ad14723ae73e6 not found: ID does not exist" Apr 20 20:23:39.512821 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.512766 2576 scope.go:117] "RemoveContainer" containerID="86bb97a7e27b55832b59d2c770ad647e2412b4e530791e0a2a01a35474c5a7bf" Apr 20 20:23:39.513023 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:23:39.513008 2576 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"86bb97a7e27b55832b59d2c770ad647e2412b4e530791e0a2a01a35474c5a7bf\": container with ID starting with 86bb97a7e27b55832b59d2c770ad647e2412b4e530791e0a2a01a35474c5a7bf not found: ID does not exist" containerID="86bb97a7e27b55832b59d2c770ad647e2412b4e530791e0a2a01a35474c5a7bf" Apr 20 20:23:39.513076 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.513027 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86bb97a7e27b55832b59d2c770ad647e2412b4e530791e0a2a01a35474c5a7bf"} err="failed to get container status \"86bb97a7e27b55832b59d2c770ad647e2412b4e530791e0a2a01a35474c5a7bf\": rpc error: code = NotFound desc = could not find container \"86bb97a7e27b55832b59d2c770ad647e2412b4e530791e0a2a01a35474c5a7bf\": container with ID starting with 86bb97a7e27b55832b59d2c770ad647e2412b4e530791e0a2a01a35474c5a7bf not found: ID does not exist" Apr 20 20:23:39.513076 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.513049 2576 scope.go:117] "RemoveContainer" containerID="edf506c5408cb0bef5cdc4293fb7ac1dcac1cc46986d352528a83e12f2470710" Apr 20 20:23:39.513298 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:23:39.513277 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edf506c5408cb0bef5cdc4293fb7ac1dcac1cc46986d352528a83e12f2470710\": container with ID starting with edf506c5408cb0bef5cdc4293fb7ac1dcac1cc46986d352528a83e12f2470710 not found: ID does not exist" containerID="edf506c5408cb0bef5cdc4293fb7ac1dcac1cc46986d352528a83e12f2470710" Apr 20 20:23:39.513415 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.513297 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf506c5408cb0bef5cdc4293fb7ac1dcac1cc46986d352528a83e12f2470710"} err="failed to get container status \"edf506c5408cb0bef5cdc4293fb7ac1dcac1cc46986d352528a83e12f2470710\": rpc error: code = NotFound desc = could not find container \"edf506c5408cb0bef5cdc4293fb7ac1dcac1cc46986d352528a83e12f2470710\": container with ID starting with edf506c5408cb0bef5cdc4293fb7ac1dcac1cc46986d352528a83e12f2470710 not found: ID does not exist" Apr 20 20:23:39.514191 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.514167 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"] Apr 20 20:23:39.517590 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.517569 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-m42sc"] Apr 20 20:23:39.542105 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.542086 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/80b222d6-0d0d-4b9b-baa8-9a04be505252-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:23:39.542199 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.542125 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/80b222d6-0d0d-4b9b-baa8-9a04be505252-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:23:39.542199 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.542137 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/80b222d6-0d0d-4b9b-baa8-9a04be505252-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:23:39.542199 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.542146 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cvqkd\" (UniqueName: \"kubernetes.io/projected/80b222d6-0d0d-4b9b-baa8-9a04be505252-kube-api-access-cvqkd\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:23:39.944615 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:39.944584 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" path="/var/lib/kubelet/pods/80b222d6-0d0d-4b9b-baa8-9a04be505252/volumes" Apr 20 20:23:40.491954 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:40.491882 2576 generic.go:358] "Generic (PLEG): container finished" podID="516a7d76-1274-4e97-9902-e7e3318f799f" containerID="7fa3a03f28e24a431eb871a9507709288caeb5c4bb5a4cf9b5cdd49d1339b3f7" exitCode=0 Apr 20 20:23:40.492391 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:23:40.491957 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" event={"ID":"516a7d76-1274-4e97-9902-e7e3318f799f","Type":"ContainerDied","Data":"7fa3a03f28e24a431eb871a9507709288caeb5c4bb5a4cf9b5cdd49d1339b3f7"} Apr 20 20:25:54.357987 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:25:54.352769 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:25:54.357987 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:25:54.354018 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:25:54.989675 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:25:54.989644 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" event={"ID":"516a7d76-1274-4e97-9902-e7e3318f799f","Type":"ContainerStarted","Data":"545197127e07c9d87b4b8f637a204227d464fe362875870c083623dc5a263268"} Apr 20 20:25:55.994736 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:25:55.994700 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" event={"ID":"516a7d76-1274-4e97-9902-e7e3318f799f","Type":"ContainerStarted","Data":"88c4c7e8d4d31ac721705ba3a45e45bdb7346440c39cc4b36dff08cd014a9018"} Apr 20 20:25:55.995127 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:25:55.994880 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:25:55.995127 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:25:55.994901 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:25:56.021057 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:25:56.021014 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" podStartSLOduration=6.647864484 podStartE2EDuration="2m21.021000852s" podCreationTimestamp="2026-04-20 20:23:35 +0000 UTC" firstStartedPulling="2026-04-20 20:23:40.493046713 +0000 UTC m=+1095.205728687" 
lastFinishedPulling="2026-04-20 20:25:54.866183067 +0000 UTC m=+1229.578865055" observedRunningTime="2026-04-20 20:25:56.019212412 +0000 UTC m=+1230.731894410" watchObservedRunningTime="2026-04-20 20:25:56.021000852 +0000 UTC m=+1230.733682849" Apr 20 20:26:02.004861 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:02.004833 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:26:32.008284 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:32.008212 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:26:35.440733 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.440703 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn"] Apr 20 20:26:35.441197 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.441024 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" podUID="516a7d76-1274-4e97-9902-e7e3318f799f" containerName="kserve-container" containerID="cri-o://545197127e07c9d87b4b8f637a204227d464fe362875870c083623dc5a263268" gracePeriod=30 Apr 20 20:26:35.441197 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.441073 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" podUID="516a7d76-1274-4e97-9902-e7e3318f799f" containerName="kube-rbac-proxy" containerID="cri-o://88c4c7e8d4d31ac721705ba3a45e45bdb7346440c39cc4b36dff08cd014a9018" gracePeriod=30 Apr 20 20:26:35.533442 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.533405 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699"] Apr 20 20:26:35.533927 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.533907 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kube-rbac-proxy" Apr 20 20:26:35.533927 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.533928 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kube-rbac-proxy" Apr 20 20:26:35.534054 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.533942 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="storage-initializer" Apr 20 20:26:35.534054 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.533948 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="storage-initializer" Apr 20 20:26:35.534054 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.533954 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kserve-container" Apr 20 20:26:35.534054 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.533960 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kserve-container" Apr 20 20:26:35.534054 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.534024 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kserve-container" Apr 20 20:26:35.534054 
ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.534032 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="80b222d6-0d0d-4b9b-baa8-9a04be505252" containerName="kube-rbac-proxy" Apr 20 20:26:35.537868 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.537849 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:35.540298 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.540274 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 20 20:26:35.540445 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.540424 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-predictor-serving-cert\"" Apr 20 20:26:35.546704 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.546319 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699"] Apr 20 20:26:35.620933 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.620890 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18e1098c-7e0e-484a-8a6b-94676d4802bc-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:35.621100 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.620951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18e1098c-7e0e-484a-8a6b-94676d4802bc-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:35.621100 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.621009 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/18e1098c-7e0e-484a-8a6b-94676d4802bc-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:35.621100 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.621080 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h22p\" (UniqueName: \"kubernetes.io/projected/18e1098c-7e0e-484a-8a6b-94676d4802bc-kube-api-access-5h22p\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:35.722152 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.722038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h22p\" (UniqueName: \"kubernetes.io/projected/18e1098c-7e0e-484a-8a6b-94676d4802bc-kube-api-access-5h22p\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:35.722152 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.722099 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18e1098c-7e0e-484a-8a6b-94676d4802bc-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:35.722385 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.722153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18e1098c-7e0e-484a-8a6b-94676d4802bc-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:35.722385 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.722190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/18e1098c-7e0e-484a-8a6b-94676d4802bc-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:35.722620 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.722597 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18e1098c-7e0e-484a-8a6b-94676d4802bc-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:35.722841 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.722821 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/18e1098c-7e0e-484a-8a6b-94676d4802bc-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:35.724709 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.724682 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18e1098c-7e0e-484a-8a6b-94676d4802bc-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:35.730211 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.730189 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h22p\" (UniqueName: \"kubernetes.io/projected/18e1098c-7e0e-484a-8a6b-94676d4802bc-kube-api-access-5h22p\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:35.850235 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.850206 2576 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:35.973156 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:35.973134 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699"] Apr 20 20:26:35.975349 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:26:35.975321 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18e1098c_7e0e_484a_8a6b_94676d4802bc.slice/crio-592a56a4ac347420ec306ce40c333350b79f2a3168c230ccc7ad1891415efe0e WatchSource:0}: Error finding container 592a56a4ac347420ec306ce40c333350b79f2a3168c230ccc7ad1891415efe0e: Status 404 returned error can't find the container with id 592a56a4ac347420ec306ce40c333350b79f2a3168c230ccc7ad1891415efe0e Apr 20 20:26:36.125495 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.125454 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" event={"ID":"18e1098c-7e0e-484a-8a6b-94676d4802bc","Type":"ContainerStarted","Data":"90cbbf472c04ab1b9f3bc4d1acfcde2a719a384d2e89c250e28b2c58fb4bd34f"} Apr 20 20:26:36.125495 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.125503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" event={"ID":"18e1098c-7e0e-484a-8a6b-94676d4802bc","Type":"ContainerStarted","Data":"592a56a4ac347420ec306ce40c333350b79f2a3168c230ccc7ad1891415efe0e"} Apr 20 20:26:36.127612 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.127583 2576 generic.go:358] "Generic (PLEG): container finished" podID="516a7d76-1274-4e97-9902-e7e3318f799f" containerID="88c4c7e8d4d31ac721705ba3a45e45bdb7346440c39cc4b36dff08cd014a9018" exitCode=2 Apr 20 20:26:36.127741 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.127624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" event={"ID":"516a7d76-1274-4e97-9902-e7e3318f799f","Type":"ContainerDied","Data":"88c4c7e8d4d31ac721705ba3a45e45bdb7346440c39cc4b36dff08cd014a9018"} Apr 20 20:26:36.501590 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.501565 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:26:36.630731 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.630696 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/516a7d76-1274-4e97-9902-e7e3318f799f-kserve-provision-location\") pod \"516a7d76-1274-4e97-9902-e7e3318f799f\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " Apr 20 20:26:36.630731 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.630735 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/516a7d76-1274-4e97-9902-e7e3318f799f-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"516a7d76-1274-4e97-9902-e7e3318f799f\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " Apr 20 20:26:36.630993 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.630794 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzqd7\" (UniqueName: \"kubernetes.io/projected/516a7d76-1274-4e97-9902-e7e3318f799f-kube-api-access-jzqd7\") pod \"516a7d76-1274-4e97-9902-e7e3318f799f\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " Apr 20 20:26:36.630993 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.630819 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/516a7d76-1274-4e97-9902-e7e3318f799f-proxy-tls\") pod \"516a7d76-1274-4e97-9902-e7e3318f799f\" (UID: \"516a7d76-1274-4e97-9902-e7e3318f799f\") " Apr 20 20:26:36.631166 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.631023 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/516a7d76-1274-4e97-9902-e7e3318f799f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "516a7d76-1274-4e97-9902-e7e3318f799f" (UID: "516a7d76-1274-4e97-9902-e7e3318f799f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:26:36.631246 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.631160 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/516a7d76-1274-4e97-9902-e7e3318f799f-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config") pod "516a7d76-1274-4e97-9902-e7e3318f799f" (UID: "516a7d76-1274-4e97-9902-e7e3318f799f"). InnerVolumeSpecName "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:26:36.632880 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.632854 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/516a7d76-1274-4e97-9902-e7e3318f799f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "516a7d76-1274-4e97-9902-e7e3318f799f" (UID: "516a7d76-1274-4e97-9902-e7e3318f799f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:26:36.632971 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.632894 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/516a7d76-1274-4e97-9902-e7e3318f799f-kube-api-access-jzqd7" (OuterVolumeSpecName: "kube-api-access-jzqd7") pod "516a7d76-1274-4e97-9902-e7e3318f799f" (UID: "516a7d76-1274-4e97-9902-e7e3318f799f"). InnerVolumeSpecName "kube-api-access-jzqd7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:26:36.731589 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.731558 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jzqd7\" (UniqueName: \"kubernetes.io/projected/516a7d76-1274-4e97-9902-e7e3318f799f-kube-api-access-jzqd7\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:26:36.731589 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.731585 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/516a7d76-1274-4e97-9902-e7e3318f799f-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:26:36.731752 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.731598 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/516a7d76-1274-4e97-9902-e7e3318f799f-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:26:36.731752 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:36.731607 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/516a7d76-1274-4e97-9902-e7e3318f799f-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:26:37.131874 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.131841 2576 generic.go:358] "Generic (PLEG): container finished" podID="516a7d76-1274-4e97-9902-e7e3318f799f" containerID="545197127e07c9d87b4b8f637a204227d464fe362875870c083623dc5a263268" exitCode=0 Apr 20 20:26:37.132044 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.131927 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" Apr 20 20:26:37.132044 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.131926 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" event={"ID":"516a7d76-1274-4e97-9902-e7e3318f799f","Type":"ContainerDied","Data":"545197127e07c9d87b4b8f637a204227d464fe362875870c083623dc5a263268"} Apr 20 20:26:37.132044 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.131970 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn" event={"ID":"516a7d76-1274-4e97-9902-e7e3318f799f","Type":"ContainerDied","Data":"bee95ed6469cc1f9d5effd30f3aca388eb4137fc186be22823c248a4af9417ef"} Apr 20 20:26:37.132044 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.131991 2576 scope.go:117] "RemoveContainer" containerID="88c4c7e8d4d31ac721705ba3a45e45bdb7346440c39cc4b36dff08cd014a9018" Apr 20 20:26:37.143798 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.143779 2576 scope.go:117] "RemoveContainer" containerID="545197127e07c9d87b4b8f637a204227d464fe362875870c083623dc5a263268" Apr 20 20:26:37.151952 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.151935 2576 scope.go:117] "RemoveContainer" containerID="7fa3a03f28e24a431eb871a9507709288caeb5c4bb5a4cf9b5cdd49d1339b3f7" Apr 20 20:26:37.157025 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.157003 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn"] Apr 20 20:26:37.159989 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.159974 2576 scope.go:117] "RemoveContainer" containerID="88c4c7e8d4d31ac721705ba3a45e45bdb7346440c39cc4b36dff08cd014a9018" Apr 20 20:26:37.160267 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:26:37.160249 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88c4c7e8d4d31ac721705ba3a45e45bdb7346440c39cc4b36dff08cd014a9018\": container with ID starting with 88c4c7e8d4d31ac721705ba3a45e45bdb7346440c39cc4b36dff08cd014a9018 not found: ID does not exist" containerID="88c4c7e8d4d31ac721705ba3a45e45bdb7346440c39cc4b36dff08cd014a9018" Apr 20 20:26:37.160331 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.160278 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88c4c7e8d4d31ac721705ba3a45e45bdb7346440c39cc4b36dff08cd014a9018"} err="failed to get container status \"88c4c7e8d4d31ac721705ba3a45e45bdb7346440c39cc4b36dff08cd014a9018\": rpc error: code = NotFound desc = could not find container \"88c4c7e8d4d31ac721705ba3a45e45bdb7346440c39cc4b36dff08cd014a9018\": container with ID starting with 88c4c7e8d4d31ac721705ba3a45e45bdb7346440c39cc4b36dff08cd014a9018 not found: ID does not exist" Apr 20 20:26:37.160331 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.160296 2576 scope.go:117] "RemoveContainer" containerID="545197127e07c9d87b4b8f637a204227d464fe362875870c083623dc5a263268" Apr 20 20:26:37.160556 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:26:37.160538 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"545197127e07c9d87b4b8f637a204227d464fe362875870c083623dc5a263268\": container with ID starting with 545197127e07c9d87b4b8f637a204227d464fe362875870c083623dc5a263268 not found: ID does not exist" 
containerID="545197127e07c9d87b4b8f637a204227d464fe362875870c083623dc5a263268" Apr 20 20:26:37.160615 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.160560 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"545197127e07c9d87b4b8f637a204227d464fe362875870c083623dc5a263268"} err="failed to get container status \"545197127e07c9d87b4b8f637a204227d464fe362875870c083623dc5a263268\": rpc error: code = NotFound desc = could not find container \"545197127e07c9d87b4b8f637a204227d464fe362875870c083623dc5a263268\": container with ID starting with 545197127e07c9d87b4b8f637a204227d464fe362875870c083623dc5a263268 not found: ID does not exist" Apr 20 20:26:37.160615 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.160579 2576 scope.go:117] "RemoveContainer" containerID="7fa3a03f28e24a431eb871a9507709288caeb5c4bb5a4cf9b5cdd49d1339b3f7" Apr 20 20:26:37.160724 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.160617 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dn5mn"] Apr 20 20:26:37.160789 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:26:37.160774 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fa3a03f28e24a431eb871a9507709288caeb5c4bb5a4cf9b5cdd49d1339b3f7\": container with ID starting with 7fa3a03f28e24a431eb871a9507709288caeb5c4bb5a4cf9b5cdd49d1339b3f7 not found: ID does not exist" containerID="7fa3a03f28e24a431eb871a9507709288caeb5c4bb5a4cf9b5cdd49d1339b3f7" Apr 20 20:26:37.160829 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.160791 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fa3a03f28e24a431eb871a9507709288caeb5c4bb5a4cf9b5cdd49d1339b3f7"} err="failed to get container status \"7fa3a03f28e24a431eb871a9507709288caeb5c4bb5a4cf9b5cdd49d1339b3f7\": rpc error: code = NotFound desc = could not find container \"7fa3a03f28e24a431eb871a9507709288caeb5c4bb5a4cf9b5cdd49d1339b3f7\": container with ID starting with 7fa3a03f28e24a431eb871a9507709288caeb5c4bb5a4cf9b5cdd49d1339b3f7 not found: ID does not exist" Apr 20 20:26:37.944902 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:37.944864 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="516a7d76-1274-4e97-9902-e7e3318f799f" path="/var/lib/kubelet/pods/516a7d76-1274-4e97-9902-e7e3318f799f/volumes" Apr 20 20:26:40.144895 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:40.144809 2576 generic.go:358] "Generic (PLEG): container finished" podID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerID="90cbbf472c04ab1b9f3bc4d1acfcde2a719a384d2e89c250e28b2c58fb4bd34f" exitCode=0 Apr 20 20:26:40.144895 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:40.144884 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" event={"ID":"18e1098c-7e0e-484a-8a6b-94676d4802bc","Type":"ContainerDied","Data":"90cbbf472c04ab1b9f3bc4d1acfcde2a719a384d2e89c250e28b2c58fb4bd34f"} Apr 20 20:26:41.149978 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:41.149943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" event={"ID":"18e1098c-7e0e-484a-8a6b-94676d4802bc","Type":"ContainerStarted","Data":"6728b9bc5e49f4d4969c917006dc35805509e11d2f1dec8adaee7d2351e8fb0b"} Apr 20 20:26:41.150382 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:41.149991 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" event={"ID":"18e1098c-7e0e-484a-8a6b-94676d4802bc","Type":"ContainerStarted","Data":"133d2edafdc8757adab6b15d347ee2952c8557ad03de00e44e7a5f98b82681c5"} Apr 20 20:26:41.150382 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:41.150304 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:41.150479 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:41.150444 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:41.151683 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:41.151659 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" podUID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 20 20:26:41.168704 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:41.168655 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" podStartSLOduration=6.168644074 podStartE2EDuration="6.168644074s" podCreationTimestamp="2026-04-20 20:26:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:26:41.167384817 +0000 UTC m=+1275.880066814" watchObservedRunningTime="2026-04-20 20:26:41.168644074 +0000 UTC m=+1275.881326070" Apr 20 20:26:42.153245 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:42.153197 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" podUID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 20 20:26:47.157911 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:47.157883 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:26:47.158442 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:47.158411 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" podUID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused" Apr 20 20:26:57.159888 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:26:57.159862 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:27:05.682881 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:05.682846 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699"] Apr 20 20:27:05.683401 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:05.683155 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" podUID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerName="kserve-container" 
containerID="cri-o://133d2edafdc8757adab6b15d347ee2952c8557ad03de00e44e7a5f98b82681c5" gracePeriod=30 Apr 20 20:27:05.683401 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:05.683190 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" podUID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerName="kube-rbac-proxy" containerID="cri-o://6728b9bc5e49f4d4969c917006dc35805509e11d2f1dec8adaee7d2351e8fb0b" gracePeriod=30 Apr 20 20:27:06.159382 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.159349 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k"] Apr 20 20:27:06.159746 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.159729 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="516a7d76-1274-4e97-9902-e7e3318f799f" containerName="storage-initializer" Apr 20 20:27:06.159825 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.159749 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="516a7d76-1274-4e97-9902-e7e3318f799f" containerName="storage-initializer" Apr 20 20:27:06.159825 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.159772 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="516a7d76-1274-4e97-9902-e7e3318f799f" containerName="kube-rbac-proxy" Apr 20 20:27:06.159825 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.159781 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="516a7d76-1274-4e97-9902-e7e3318f799f" containerName="kube-rbac-proxy" Apr 20 20:27:06.159825 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.159803 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="516a7d76-1274-4e97-9902-e7e3318f799f" containerName="kserve-container" Apr 20 20:27:06.159825 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.159811 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="516a7d76-1274-4e97-9902-e7e3318f799f" containerName="kserve-container" Apr 20 20:27:06.160081 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.159897 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="516a7d76-1274-4e97-9902-e7e3318f799f" containerName="kserve-container" Apr 20 20:27:06.160081 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.159912 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="516a7d76-1274-4e97-9902-e7e3318f799f" containerName="kube-rbac-proxy" Apr 20 20:27:06.163008 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.162987 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:06.165386 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.165368 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-predictor-serving-cert\"" Apr 20 20:27:06.165478 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.165386 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 20 20:27:06.220769 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.220739 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k"] Apr 20 20:27:06.234064 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.234034 2576 generic.go:358] "Generic (PLEG): container finished" podID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerID="6728b9bc5e49f4d4969c917006dc35805509e11d2f1dec8adaee7d2351e8fb0b" exitCode=2 Apr 20 20:27:06.234064 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.234061 2576 generic.go:358] "Generic (PLEG): container finished" podID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerID="133d2edafdc8757adab6b15d347ee2952c8557ad03de00e44e7a5f98b82681c5" exitCode=0 Apr 20 20:27:06.234249 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.234166 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" event={"ID":"18e1098c-7e0e-484a-8a6b-94676d4802bc","Type":"ContainerDied","Data":"6728b9bc5e49f4d4969c917006dc35805509e11d2f1dec8adaee7d2351e8fb0b"} Apr 20 20:27:06.234249 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.234196 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" event={"ID":"18e1098c-7e0e-484a-8a6b-94676d4802bc","Type":"ContainerDied","Data":"133d2edafdc8757adab6b15d347ee2952c8557ad03de00e44e7a5f98b82681c5"} Apr 20 20:27:06.265047 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.265022 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9gc8\" (UniqueName: \"kubernetes.io/projected/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-kube-api-access-c9gc8\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:06.265166 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.265102 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:06.265214 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.265173 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" 
Apr 20 20:27:06.265255 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.265212 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:06.338204 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.338184 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:27:06.365738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.365716 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18e1098c-7e0e-484a-8a6b-94676d4802bc-proxy-tls\") pod \"18e1098c-7e0e-484a-8a6b-94676d4802bc\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " Apr 20 20:27:06.365872 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.365767 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/18e1098c-7e0e-484a-8a6b-94676d4802bc-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"18e1098c-7e0e-484a-8a6b-94676d4802bc\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " Apr 20 20:27:06.365872 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.365832 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h22p\" (UniqueName: \"kubernetes.io/projected/18e1098c-7e0e-484a-8a6b-94676d4802bc-kube-api-access-5h22p\") pod \"18e1098c-7e0e-484a-8a6b-94676d4802bc\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " Apr 20 20:27:06.365872 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.365866 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18e1098c-7e0e-484a-8a6b-94676d4802bc-kserve-provision-location\") pod \"18e1098c-7e0e-484a-8a6b-94676d4802bc\" (UID: \"18e1098c-7e0e-484a-8a6b-94676d4802bc\") " Apr 20 20:27:06.366051 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.365943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:06.366051 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.365997 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:06.366051 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.366025 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:06.366257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.366087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c9gc8\" (UniqueName: \"kubernetes.io/projected/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-kube-api-access-c9gc8\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:06.366314 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.366179 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18e1098c-7e0e-484a-8a6b-94676d4802bc-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config") pod "18e1098c-7e0e-484a-8a6b-94676d4802bc" (UID: "18e1098c-7e0e-484a-8a6b-94676d4802bc"). InnerVolumeSpecName "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:27:06.366686 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.366657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:06.366847 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.366819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:06.366948 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.366856 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/18e1098c-7e0e-484a-8a6b-94676d4802bc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "18e1098c-7e0e-484a-8a6b-94676d4802bc" (UID: "18e1098c-7e0e-484a-8a6b-94676d4802bc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:27:06.368056 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.368026 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/18e1098c-7e0e-484a-8a6b-94676d4802bc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "18e1098c-7e0e-484a-8a6b-94676d4802bc" (UID: "18e1098c-7e0e-484a-8a6b-94676d4802bc"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:27:06.368383 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.368364 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e1098c-7e0e-484a-8a6b-94676d4802bc-kube-api-access-5h22p" (OuterVolumeSpecName: "kube-api-access-5h22p") pod "18e1098c-7e0e-484a-8a6b-94676d4802bc" (UID: "18e1098c-7e0e-484a-8a6b-94676d4802bc"). InnerVolumeSpecName "kube-api-access-5h22p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:27:06.368712 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.368695 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:06.373669 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.373651 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9gc8\" (UniqueName: \"kubernetes.io/projected/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-kube-api-access-c9gc8\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:06.467494 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.467431 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5h22p\" (UniqueName: \"kubernetes.io/projected/18e1098c-7e0e-484a-8a6b-94676d4802bc-kube-api-access-5h22p\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:27:06.467494 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.467455 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/18e1098c-7e0e-484a-8a6b-94676d4802bc-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:27:06.467494 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.467465 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/18e1098c-7e0e-484a-8a6b-94676d4802bc-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:27:06.467494 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.467476 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/18e1098c-7e0e-484a-8a6b-94676d4802bc-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:27:06.510641 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.510614 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:06.633529 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:06.633503 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k"] Apr 20 20:27:06.635165 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:27:06.635138 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83db2bdf_7f5d_405e_9fa4_8a76ee99d46c.slice/crio-0ea64bd8366fe781811e54499c0705c07da0405fa47664950158a2e3c3ebffd7 WatchSource:0}: Error finding container 0ea64bd8366fe781811e54499c0705c07da0405fa47664950158a2e3c3ebffd7: Status 404 returned error can't find the container with id 0ea64bd8366fe781811e54499c0705c07da0405fa47664950158a2e3c3ebffd7 Apr 20 20:27:07.246381 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:07.246337 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" event={"ID":"18e1098c-7e0e-484a-8a6b-94676d4802bc","Type":"ContainerDied","Data":"592a56a4ac347420ec306ce40c333350b79f2a3168c230ccc7ad1891415efe0e"} Apr 20 20:27:07.246791 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:07.246389 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699" Apr 20 20:27:07.246791 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:07.246398 2576 scope.go:117] "RemoveContainer" containerID="6728b9bc5e49f4d4969c917006dc35805509e11d2f1dec8adaee7d2351e8fb0b" Apr 20 20:27:07.248033 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:07.248008 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" event={"ID":"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c","Type":"ContainerStarted","Data":"3e29466b5eb448d6d441b462963eed13a2b37b16c8ad47b357d8751fdec1108e"} Apr 20 20:27:07.248197 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:07.248040 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" event={"ID":"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c","Type":"ContainerStarted","Data":"0ea64bd8366fe781811e54499c0705c07da0405fa47664950158a2e3c3ebffd7"} Apr 20 20:27:07.255184 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:07.255165 2576 scope.go:117] "RemoveContainer" containerID="133d2edafdc8757adab6b15d347ee2952c8557ad03de00e44e7a5f98b82681c5" Apr 20 20:27:07.262669 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:07.262650 2576 scope.go:117] "RemoveContainer" containerID="90cbbf472c04ab1b9f3bc4d1acfcde2a719a384d2e89c250e28b2c58fb4bd34f" Apr 20 20:27:07.279768 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:07.279747 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699"] Apr 20 20:27:07.283526 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:07.283505 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-j4699"] Apr 20 20:27:07.947794 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:07.947755 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e1098c-7e0e-484a-8a6b-94676d4802bc" path="/var/lib/kubelet/pods/18e1098c-7e0e-484a-8a6b-94676d4802bc/volumes" Apr 20 20:27:11.261542 ip-10-0-143-23 kubenswrapper[2576]: I0420 
20:27:11.261473 2576 generic.go:358] "Generic (PLEG): container finished" podID="83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" containerID="3e29466b5eb448d6d441b462963eed13a2b37b16c8ad47b357d8751fdec1108e" exitCode=0 Apr 20 20:27:11.261975 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:11.261549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" event={"ID":"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c","Type":"ContainerDied","Data":"3e29466b5eb448d6d441b462963eed13a2b37b16c8ad47b357d8751fdec1108e"} Apr 20 20:27:12.267178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:12.267139 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" event={"ID":"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c","Type":"ContainerStarted","Data":"a5e64e2b4edf20db43d948be3a05e7b321c8773ba1f5d07c3e7fbc1dde924a24"} Apr 20 20:27:12.267573 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:12.267188 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" event={"ID":"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c","Type":"ContainerStarted","Data":"8e9bb9c7e376da03c13f2665b15e2e0e116aff703b502c412a890b8e33433b01"} Apr 20 20:27:12.267573 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:12.267459 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:12.267573 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:12.267512 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:12.286317 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:12.286246 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" podStartSLOduration=6.286234102 podStartE2EDuration="6.286234102s" podCreationTimestamp="2026-04-20 20:27:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:27:12.285267083 +0000 UTC m=+1306.997949118" watchObservedRunningTime="2026-04-20 20:27:12.286234102 +0000 UTC m=+1306.998916101" Apr 20 20:27:18.276712 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:18.276685 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:48.280275 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:48.280246 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:55.801150 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.801100 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k"] Apr 20 20:27:55.801691 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.801533 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" podUID="83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" containerName="kserve-container" containerID="cri-o://8e9bb9c7e376da03c13f2665b15e2e0e116aff703b502c412a890b8e33433b01" gracePeriod=30 Apr 20 20:27:55.801691 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.801570 2576 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" podUID="83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" containerName="kube-rbac-proxy" containerID="cri-o://a5e64e2b4edf20db43d948be3a05e7b321c8773ba1f5d07c3e7fbc1dde924a24" gracePeriod=30 Apr 20 20:27:55.891994 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.891959 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"] Apr 20 20:27:55.892525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.892504 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerName="kserve-container" Apr 20 20:27:55.892591 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.892530 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerName="kserve-container" Apr 20 20:27:55.892591 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.892565 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerName="storage-initializer" Apr 20 20:27:55.892591 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.892574 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerName="storage-initializer" Apr 20 20:27:55.892591 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.892586 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerName="kube-rbac-proxy" Apr 20 20:27:55.892714 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.892594 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerName="kube-rbac-proxy" Apr 20 20:27:55.892714 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.892681 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerName="kube-rbac-proxy" Apr 20 20:27:55.892714 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.892693 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="18e1098c-7e0e-484a-8a6b-94676d4802bc" containerName="kserve-container" Apr 20 20:27:55.896610 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.896590 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" Apr 20 20:27:55.899096 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.899070 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-predictor-serving-cert\"" Apr 20 20:27:55.899243 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.899099 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\"" Apr 20 20:27:55.908633 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:55.908607 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"] Apr 20 20:27:56.057806 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.057712 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3148df86-6705-4616-bd15-e4c87159279f-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" Apr 20 20:27:56.057806 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.057760 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3148df86-6705-4616-bd15-e4c87159279f-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" Apr 20 20:27:56.057997 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.057934 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw52m\" (UniqueName: \"kubernetes.io/projected/3148df86-6705-4616-bd15-e4c87159279f-kube-api-access-qw52m\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" Apr 20 20:27:56.058074 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.058053 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3148df86-6705-4616-bd15-e4c87159279f-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" Apr 20 20:27:56.158898 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.158864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qw52m\" (UniqueName: \"kubernetes.io/projected/3148df86-6705-4616-bd15-e4c87159279f-kube-api-access-qw52m\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" Apr 20 20:27:56.159140 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.158931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3148df86-6705-4616-bd15-e4c87159279f-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk\" (UID: 
\"3148df86-6705-4616-bd15-e4c87159279f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" Apr 20 20:27:56.159140 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.158961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3148df86-6705-4616-bd15-e4c87159279f-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" Apr 20 20:27:56.159140 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.158980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3148df86-6705-4616-bd15-e4c87159279f-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" Apr 20 20:27:56.159340 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:27:56.159210 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-serving-cert: secret "isvc-sklearn-mcp-predictor-serving-cert" not found Apr 20 20:27:56.159340 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:27:56.159287 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3148df86-6705-4616-bd15-e4c87159279f-proxy-tls podName:3148df86-6705-4616-bd15-e4c87159279f nodeName:}" failed. No retries permitted until 2026-04-20 20:27:56.659267809 +0000 UTC m=+1351.371949798 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/3148df86-6705-4616-bd15-e4c87159279f-proxy-tls") pod "isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" (UID: "3148df86-6705-4616-bd15-e4c87159279f") : secret "isvc-sklearn-mcp-predictor-serving-cert" not found Apr 20 20:27:56.159428 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.159402 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3148df86-6705-4616-bd15-e4c87159279f-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" Apr 20 20:27:56.159769 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.159750 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3148df86-6705-4616-bd15-e4c87159279f-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" Apr 20 20:27:56.169893 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.169868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw52m\" (UniqueName: \"kubernetes.io/projected/3148df86-6705-4616-bd15-e4c87159279f-kube-api-access-qw52m\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" Apr 20 20:27:56.416438 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.416356 2576 generic.go:358] "Generic (PLEG): container finished" podID="83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" 
containerID="a5e64e2b4edf20db43d948be3a05e7b321c8773ba1f5d07c3e7fbc1dde924a24" exitCode=2 Apr 20 20:27:56.416592 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.416427 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" event={"ID":"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c","Type":"ContainerDied","Data":"a5e64e2b4edf20db43d948be3a05e7b321c8773ba1f5d07c3e7fbc1dde924a24"} Apr 20 20:27:56.665006 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.664970 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3148df86-6705-4616-bd15-e4c87159279f-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" Apr 20 20:27:56.667614 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.667548 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3148df86-6705-4616-bd15-e4c87159279f-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" Apr 20 20:27:56.809786 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.809754 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" Apr 20 20:27:56.953683 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:56.953660 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" Apr 20 20:27:57.067533 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.067497 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-proxy-tls\") pod \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " Apr 20 20:27:57.067720 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.067555 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-kserve-provision-location\") pod \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " Apr 20 20:27:57.067720 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.067602 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9gc8\" (UniqueName: \"kubernetes.io/projected/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-kube-api-access-c9gc8\") pod \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " Apr 20 20:27:57.067720 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.067645 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\" (UID: \"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c\") " Apr 20 20:27:57.068049 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.067907 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" (UID: "83db2bdf-7f5d-405e-9fa4-8a76ee99d46c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:27:57.068186 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.068057 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config") pod "83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" (UID: "83db2bdf-7f5d-405e-9fa4-8a76ee99d46c"). InnerVolumeSpecName "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:27:57.069497 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.069477 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" (UID: "83db2bdf-7f5d-405e-9fa4-8a76ee99d46c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:27:57.069596 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.069580 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-kube-api-access-c9gc8" (OuterVolumeSpecName: "kube-api-access-c9gc8") pod "83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" (UID: "83db2bdf-7f5d-405e-9fa4-8a76ee99d46c"). InnerVolumeSpecName "kube-api-access-c9gc8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:27:57.151047 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.151025 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"] Apr 20 20:27:57.153148 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:27:57.153124 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3148df86_6705_4616_bd15_e4c87159279f.slice/crio-6040fd726bfdfbeba96478de439d520cfe0041a9bae7e0d0068fed58ef3b1caa WatchSource:0}: Error finding container 6040fd726bfdfbeba96478de439d520cfe0041a9bae7e0d0068fed58ef3b1caa: Status 404 returned error can't find the container with id 6040fd726bfdfbeba96478de439d520cfe0041a9bae7e0d0068fed58ef3b1caa Apr 20 20:27:57.168442 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.168419 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:27:57.168442 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.168444 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:27:57.168587 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.168453 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c9gc8\" (UniqueName: \"kubernetes.io/projected/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-kube-api-access-c9gc8\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 
20:27:57.168587 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.168464 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:27:57.422244 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.422208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" event={"ID":"3148df86-6705-4616-bd15-e4c87159279f","Type":"ContainerStarted","Data":"d374e49db21e668cbdb5c1eb71fae701bcec4793ae98076447afd6be57972c26"}
Apr 20 20:27:57.422436 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.422252 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" event={"ID":"3148df86-6705-4616-bd15-e4c87159279f","Type":"ContainerStarted","Data":"6040fd726bfdfbeba96478de439d520cfe0041a9bae7e0d0068fed58ef3b1caa"}
Apr 20 20:27:57.423973 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.423946 2576 generic.go:358] "Generic (PLEG): container finished" podID="83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" containerID="8e9bb9c7e376da03c13f2665b15e2e0e116aff703b502c412a890b8e33433b01" exitCode=0
Apr 20 20:27:57.424081 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.424004 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" event={"ID":"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c","Type":"ContainerDied","Data":"8e9bb9c7e376da03c13f2665b15e2e0e116aff703b502c412a890b8e33433b01"}
Apr 20 20:27:57.424081 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.424031 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k" event={"ID":"83db2bdf-7f5d-405e-9fa4-8a76ee99d46c","Type":"ContainerDied","Data":"0ea64bd8366fe781811e54499c0705c07da0405fa47664950158a2e3c3ebffd7"}
Apr 20 20:27:57.424081 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.424046 2576 scope.go:117] "RemoveContainer" containerID="a5e64e2b4edf20db43d948be3a05e7b321c8773ba1f5d07c3e7fbc1dde924a24"
Apr 20 20:27:57.424081 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.424044 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k"
Apr 20 20:27:57.431916 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.431891 2576 scope.go:117] "RemoveContainer" containerID="8e9bb9c7e376da03c13f2665b15e2e0e116aff703b502c412a890b8e33433b01"
Apr 20 20:27:57.439684 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.439644 2576 scope.go:117] "RemoveContainer" containerID="3e29466b5eb448d6d441b462963eed13a2b37b16c8ad47b357d8751fdec1108e"
Apr 20 20:27:57.447273 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.447259 2576 scope.go:117] "RemoveContainer" containerID="a5e64e2b4edf20db43d948be3a05e7b321c8773ba1f5d07c3e7fbc1dde924a24"
Apr 20 20:27:57.447542 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:27:57.447490 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5e64e2b4edf20db43d948be3a05e7b321c8773ba1f5d07c3e7fbc1dde924a24\": container with ID starting with a5e64e2b4edf20db43d948be3a05e7b321c8773ba1f5d07c3e7fbc1dde924a24 not found: ID does not exist" containerID="a5e64e2b4edf20db43d948be3a05e7b321c8773ba1f5d07c3e7fbc1dde924a24"
Apr 20 20:27:57.447542 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.447518 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5e64e2b4edf20db43d948be3a05e7b321c8773ba1f5d07c3e7fbc1dde924a24"} err="failed to get container status \"a5e64e2b4edf20db43d948be3a05e7b321c8773ba1f5d07c3e7fbc1dde924a24\": rpc error: code = NotFound desc = could not find container \"a5e64e2b4edf20db43d948be3a05e7b321c8773ba1f5d07c3e7fbc1dde924a24\": container with ID starting with a5e64e2b4edf20db43d948be3a05e7b321c8773ba1f5d07c3e7fbc1dde924a24 not found: ID does not exist"
Apr 20 20:27:57.447542 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.447540 2576 scope.go:117] "RemoveContainer" containerID="8e9bb9c7e376da03c13f2665b15e2e0e116aff703b502c412a890b8e33433b01"
Apr 20 20:27:57.447793 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:27:57.447774 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e9bb9c7e376da03c13f2665b15e2e0e116aff703b502c412a890b8e33433b01\": container with ID starting with 8e9bb9c7e376da03c13f2665b15e2e0e116aff703b502c412a890b8e33433b01 not found: ID does not exist" containerID="8e9bb9c7e376da03c13f2665b15e2e0e116aff703b502c412a890b8e33433b01"
Apr 20 20:27:57.447846 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.447801 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e9bb9c7e376da03c13f2665b15e2e0e116aff703b502c412a890b8e33433b01"} err="failed to get container status \"8e9bb9c7e376da03c13f2665b15e2e0e116aff703b502c412a890b8e33433b01\": rpc error: code = NotFound desc = could not find container \"8e9bb9c7e376da03c13f2665b15e2e0e116aff703b502c412a890b8e33433b01\": container with ID starting with 8e9bb9c7e376da03c13f2665b15e2e0e116aff703b502c412a890b8e33433b01 not found: ID does not exist"
Apr 20 20:27:57.447846 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.447817 2576 scope.go:117] "RemoveContainer" containerID="3e29466b5eb448d6d441b462963eed13a2b37b16c8ad47b357d8751fdec1108e"
Apr 20 20:27:57.448130 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:27:57.448095 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e29466b5eb448d6d441b462963eed13a2b37b16c8ad47b357d8751fdec1108e\": container with ID starting with 3e29466b5eb448d6d441b462963eed13a2b37b16c8ad47b357d8751fdec1108e not found: ID does not exist" containerID="3e29466b5eb448d6d441b462963eed13a2b37b16c8ad47b357d8751fdec1108e"
Apr 20 20:27:57.448194 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.448137 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e29466b5eb448d6d441b462963eed13a2b37b16c8ad47b357d8751fdec1108e"} err="failed to get container status \"3e29466b5eb448d6d441b462963eed13a2b37b16c8ad47b357d8751fdec1108e\": rpc error: code = NotFound desc = could not find container \"3e29466b5eb448d6d441b462963eed13a2b37b16c8ad47b357d8751fdec1108e\": container with ID starting with 3e29466b5eb448d6d441b462963eed13a2b37b16c8ad47b357d8751fdec1108e not found: ID does not exist"
Apr 20 20:27:57.455732 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.455714 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k"]
Apr 20 20:27:57.459171 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.459151 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-sks8k"]
Apr 20 20:27:57.946866 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:27:57.946833 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" path="/var/lib/kubelet/pods/83db2bdf-7f5d-405e-9fa4-8a76ee99d46c/volumes"
Apr 20 20:28:01.439280 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:28:01.439249 2576 generic.go:358] "Generic (PLEG): container finished" podID="3148df86-6705-4616-bd15-e4c87159279f" containerID="d374e49db21e668cbdb5c1eb71fae701bcec4793ae98076447afd6be57972c26" exitCode=0
Apr 20 20:28:01.439689 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:28:01.439323 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" event={"ID":"3148df86-6705-4616-bd15-e4c87159279f","Type":"ContainerDied","Data":"d374e49db21e668cbdb5c1eb71fae701bcec4793ae98076447afd6be57972c26"}
Apr 20 20:28:02.445129 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:28:02.445079 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" event={"ID":"3148df86-6705-4616-bd15-e4c87159279f","Type":"ContainerStarted","Data":"f31bcf3a2482a552f19555fdf0f0ed7f237081dbc61b15e4d54e5c1ff85713ba"}
Apr 20 20:28:04.454363 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:28:04.454330 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" event={"ID":"3148df86-6705-4616-bd15-e4c87159279f","Type":"ContainerStarted","Data":"6e3aeabc990811441fd45b7fe6851f4908420d9c08fd19c3eea8ede6964e516e"}
Apr 20 20:28:04.454363 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:28:04.454363 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" event={"ID":"3148df86-6705-4616-bd15-e4c87159279f","Type":"ContainerStarted","Data":"f6b74bc0f2a02cb16dff8f268846c5c6fd63e979a013944fdfc94103bb5ee32c"}
Apr 20 20:28:04.454770 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:28:04.454522 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"
Apr 20 20:28:04.478213 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:28:04.478150 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" podStartSLOduration=7.017719723 podStartE2EDuration="9.478132732s" podCreationTimestamp="2026-04-20 20:27:55 +0000 UTC" firstStartedPulling="2026-04-20 20:28:01.504875954 +0000 UTC m=+1356.217557928" lastFinishedPulling="2026-04-20 20:28:03.965288958 +0000 UTC m=+1358.677970937" observedRunningTime="2026-04-20 20:28:04.475774907 +0000 UTC m=+1359.188456904" watchObservedRunningTime="2026-04-20 20:28:04.478132732 +0000 UTC m=+1359.190814729"
Apr 20 20:28:05.457676 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:28:05.457646 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"
Apr 20 20:28:05.457676 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:28:05.457678 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"
Apr 20 20:28:11.467808 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:28:11.467778 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"
Apr 20 20:28:31.469257 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:28:31.469218 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.46:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 20 20:28:41.470340 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:28:41.470308 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"
Apr 20 20:29:11.471704 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:11.471634 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"
Apr 20 20:29:15.965817 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:15.965780 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"]
Apr 20 20:29:15.966299 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:15.966266 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kserve-container" containerID="cri-o://f31bcf3a2482a552f19555fdf0f0ed7f237081dbc61b15e4d54e5c1ff85713ba" gracePeriod=30
Apr 20 20:29:15.966419 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:15.966314 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kube-rbac-proxy" containerID="cri-o://6e3aeabc990811441fd45b7fe6851f4908420d9c08fd19c3eea8ede6964e516e" gracePeriod=30
Apr 20 20:29:15.966419 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:15.966303 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kserve-agent" containerID="cri-o://f6b74bc0f2a02cb16dff8f268846c5c6fd63e979a013944fdfc94103bb5ee32c" gracePeriod=30
Apr 20 20:29:16.041464 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.041424 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"]
Apr 20 20:29:16.041779 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.041766 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" containerName="storage-initializer"
Apr 20 20:29:16.041779 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.041780 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" containerName="storage-initializer"
Apr 20 20:29:16.041867 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.041797 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" containerName="kube-rbac-proxy"
Apr 20 20:29:16.041867 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.041803 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" containerName="kube-rbac-proxy"
Apr 20 20:29:16.041867 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.041817 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" containerName="kserve-container"
Apr 20 20:29:16.041867 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.041822 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" containerName="kserve-container"
Apr 20 20:29:16.041985 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.041879 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" containerName="kube-rbac-proxy"
Apr 20 20:29:16.041985 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.041887 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="83db2bdf-7f5d-405e-9fa4-8a76ee99d46c" containerName="kserve-container"
Apr 20 20:29:16.045174 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.045153 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:16.047626 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.047599 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\""
Apr 20 20:29:16.047755 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.047705 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\""
Apr 20 20:29:16.055869 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.055844 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"]
Apr 20 20:29:16.150181 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.150135 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/68f25676-8fce-485d-93b4-7f2c97b1ab0b-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-hmh9f\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:16.150329 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.150225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kbrn\" (UniqueName: \"kubernetes.io/projected/68f25676-8fce-485d-93b4-7f2c97b1ab0b-kube-api-access-9kbrn\") pod \"isvc-paddle-predictor-6b8b7cfb4b-hmh9f\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:16.150329 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.150290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68f25676-8fce-485d-93b4-7f2c97b1ab0b-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-hmh9f\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:16.150412 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.150326 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68f25676-8fce-485d-93b4-7f2c97b1ab0b-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-hmh9f\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:16.251829 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.251744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68f25676-8fce-485d-93b4-7f2c97b1ab0b-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-hmh9f\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:16.251829 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.251802 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68f25676-8fce-485d-93b4-7f2c97b1ab0b-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-hmh9f\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:16.252004 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.251866 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/68f25676-8fce-485d-93b4-7f2c97b1ab0b-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-hmh9f\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:16.252004 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.251927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kbrn\" (UniqueName: \"kubernetes.io/projected/68f25676-8fce-485d-93b4-7f2c97b1ab0b-kube-api-access-9kbrn\") pod \"isvc-paddle-predictor-6b8b7cfb4b-hmh9f\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:16.252004 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:29:16.251976 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-paddle-predictor-serving-cert: secret "isvc-paddle-predictor-serving-cert" not found
Apr 20 20:29:16.252178 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:29:16.252062 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/68f25676-8fce-485d-93b4-7f2c97b1ab0b-proxy-tls podName:68f25676-8fce-485d-93b4-7f2c97b1ab0b nodeName:}" failed. No retries permitted until 2026-04-20 20:29:16.752025111 +0000 UTC m=+1431.464707089 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/68f25676-8fce-485d-93b4-7f2c97b1ab0b-proxy-tls") pod "isvc-paddle-predictor-6b8b7cfb4b-hmh9f" (UID: "68f25676-8fce-485d-93b4-7f2c97b1ab0b") : secret "isvc-paddle-predictor-serving-cert" not found
Apr 20 20:29:16.252178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.252153 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68f25676-8fce-485d-93b4-7f2c97b1ab0b-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-hmh9f\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:16.252491 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.252473 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/68f25676-8fce-485d-93b4-7f2c97b1ab0b-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-hmh9f\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:16.260213 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.260192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kbrn\" (UniqueName: \"kubernetes.io/projected/68f25676-8fce-485d-93b4-7f2c97b1ab0b-kube-api-access-9kbrn\") pod \"isvc-paddle-predictor-6b8b7cfb4b-hmh9f\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:16.461827 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.461775 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused"
Apr 20 20:29:16.702404 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.702370 2576 generic.go:358] "Generic (PLEG): container finished" podID="3148df86-6705-4616-bd15-e4c87159279f" containerID="6e3aeabc990811441fd45b7fe6851f4908420d9c08fd19c3eea8ede6964e516e" exitCode=2
Apr 20 20:29:16.702667 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.702447 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" event={"ID":"3148df86-6705-4616-bd15-e4c87159279f","Type":"ContainerDied","Data":"6e3aeabc990811441fd45b7fe6851f4908420d9c08fd19c3eea8ede6964e516e"}
Apr 20 20:29:16.755980 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.755946 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68f25676-8fce-485d-93b4-7f2c97b1ab0b-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-hmh9f\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:16.758386 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.758356 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68f25676-8fce-485d-93b4-7f2c97b1ab0b-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-hmh9f\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:16.956597 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:16.956507 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:17.075526 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:17.075503 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"]
Apr 20 20:29:17.077608 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:29:17.077581 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68f25676_8fce_485d_93b4_7f2c97b1ab0b.slice/crio-c98b2249d0a1d53fef4114aa0575370e6111610bf7e096861ac668d3bb89c9ca WatchSource:0}: Error finding container c98b2249d0a1d53fef4114aa0575370e6111610bf7e096861ac668d3bb89c9ca: Status 404 returned error can't find the container with id c98b2249d0a1d53fef4114aa0575370e6111610bf7e096861ac668d3bb89c9ca
Apr 20 20:29:17.079776 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:17.079758 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:29:17.707147 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:17.707098 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" event={"ID":"68f25676-8fce-485d-93b4-7f2c97b1ab0b","Type":"ContainerStarted","Data":"c798943721896e3fa8c48a0d58d551503c2d14ce50d5740d170239c581b4bac6"}
Apr 20 20:29:17.707147 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:17.707147 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" event={"ID":"68f25676-8fce-485d-93b4-7f2c97b1ab0b","Type":"ContainerStarted","Data":"c98b2249d0a1d53fef4114aa0575370e6111610bf7e096861ac668d3bb89c9ca"}
Apr 20 20:29:18.714698 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:18.714659 2576 generic.go:358] "Generic (PLEG): container finished" podID="3148df86-6705-4616-bd15-e4c87159279f" containerID="f31bcf3a2482a552f19555fdf0f0ed7f237081dbc61b15e4d54e5c1ff85713ba" exitCode=0
Apr 20 20:29:18.715064 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:18.714731 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" event={"ID":"3148df86-6705-4616-bd15-e4c87159279f","Type":"ContainerDied","Data":"f31bcf3a2482a552f19555fdf0f0ed7f237081dbc61b15e4d54e5c1ff85713ba"}
Apr 20 20:29:21.462157 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:21.462106 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused"
Apr 20 20:29:21.468641 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:21.468616 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.46:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 20 20:29:21.731350 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:21.731268 2576 generic.go:358] "Generic (PLEG): container finished" podID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerID="c798943721896e3fa8c48a0d58d551503c2d14ce50d5740d170239c581b4bac6" exitCode=0
Apr 20 20:29:21.731350 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:21.731341 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" event={"ID":"68f25676-8fce-485d-93b4-7f2c97b1ab0b","Type":"ContainerDied","Data":"c798943721896e3fa8c48a0d58d551503c2d14ce50d5740d170239c581b4bac6"}
Apr 20 20:29:26.462385 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:26.462344 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused"
Apr 20 20:29:26.462786 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:26.462535 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"
Apr 20 20:29:31.461909 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:31.461866 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused"
Apr 20 20:29:31.468468 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:31.468437 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.46:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 20 20:29:34.782001 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:34.781965 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" event={"ID":"68f25676-8fce-485d-93b4-7f2c97b1ab0b","Type":"ContainerStarted","Data":"f84714d1a630dd4a39571f26bb1098d95d8157cfa9d6c7727f0e4d0fa9042006"}
Apr 20 20:29:34.782001 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:34.782007 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" event={"ID":"68f25676-8fce-485d-93b4-7f2c97b1ab0b","Type":"ContainerStarted","Data":"22cd599f1206771cf0cc398ce2ab0170fbe78a8a5f7eb139b887683a5775de1e"}
Apr 20 20:29:34.782601 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:34.782268 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:34.800977 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:34.800933 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" podStartSLOduration=6.47378166 podStartE2EDuration="18.800920855s" podCreationTimestamp="2026-04-20 20:29:16 +0000 UTC" firstStartedPulling="2026-04-20 20:29:21.732534641 +0000 UTC m=+1436.445216615" lastFinishedPulling="2026-04-20 20:29:34.05967382 +0000 UTC m=+1448.772355810" observedRunningTime="2026-04-20 20:29:34.799456102 +0000 UTC m=+1449.512138100" watchObservedRunningTime="2026-04-20 20:29:34.800920855 +0000 UTC m=+1449.513602854"
Apr 20 20:29:35.784943 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:35.784909 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:35.786058 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:35.786028 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 20 20:29:36.462213 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:36.462175 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused"
Apr 20 20:29:36.788266 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:36.788222 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 20 20:29:41.462052 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:41.462005 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused"
Apr 20 20:29:41.468497 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:41.468469 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.46:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 20 20:29:41.468596 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:41.468584 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"
Apr 20 20:29:41.798057 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:41.798026 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:29:41.798712 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:41.798679 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 20 20:29:46.112233 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.112208 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"
Apr 20 20:29:46.189610 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.189577 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3148df86-6705-4616-bd15-e4c87159279f-kserve-provision-location\") pod \"3148df86-6705-4616-bd15-e4c87159279f\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") "
Apr 20 20:29:46.189768 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.189641 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3148df86-6705-4616-bd15-e4c87159279f-proxy-tls\") pod \"3148df86-6705-4616-bd15-e4c87159279f\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") "
Apr 20 20:29:46.189768 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.189714 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qw52m\" (UniqueName: \"kubernetes.io/projected/3148df86-6705-4616-bd15-e4c87159279f-kube-api-access-qw52m\") pod \"3148df86-6705-4616-bd15-e4c87159279f\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") "
Apr 20 20:29:46.189900 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.189771 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3148df86-6705-4616-bd15-e4c87159279f-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"3148df86-6705-4616-bd15-e4c87159279f\" (UID: \"3148df86-6705-4616-bd15-e4c87159279f\") "
Apr 20 20:29:46.189900 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.189873 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3148df86-6705-4616-bd15-e4c87159279f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3148df86-6705-4616-bd15-e4c87159279f" (UID: "3148df86-6705-4616-bd15-e4c87159279f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:29:46.190052 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.190032 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3148df86-6705-4616-bd15-e4c87159279f-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:29:46.190188 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.190161 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3148df86-6705-4616-bd15-e4c87159279f-isvc-sklearn-mcp-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-mcp-kube-rbac-proxy-sar-config") pod "3148df86-6705-4616-bd15-e4c87159279f" (UID: "3148df86-6705-4616-bd15-e4c87159279f"). InnerVolumeSpecName "isvc-sklearn-mcp-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:29:46.191683 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.191659 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3148df86-6705-4616-bd15-e4c87159279f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3148df86-6705-4616-bd15-e4c87159279f" (UID: "3148df86-6705-4616-bd15-e4c87159279f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:29:46.191788 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.191696 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3148df86-6705-4616-bd15-e4c87159279f-kube-api-access-qw52m" (OuterVolumeSpecName: "kube-api-access-qw52m") pod "3148df86-6705-4616-bd15-e4c87159279f" (UID: "3148df86-6705-4616-bd15-e4c87159279f"). InnerVolumeSpecName "kube-api-access-qw52m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:29:46.290657 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.290629 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3148df86-6705-4616-bd15-e4c87159279f-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:29:46.290657 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.290658 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qw52m\" (UniqueName: \"kubernetes.io/projected/3148df86-6705-4616-bd15-e4c87159279f-kube-api-access-qw52m\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:29:46.290827 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.290669 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3148df86-6705-4616-bd15-e4c87159279f-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:29:46.824446 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.824409 2576 generic.go:358] "Generic (PLEG): container finished" podID="3148df86-6705-4616-bd15-e4c87159279f" containerID="f6b74bc0f2a02cb16dff8f268846c5c6fd63e979a013944fdfc94103bb5ee32c" exitCode=0
Apr 20 20:29:46.824643 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.824475 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" event={"ID":"3148df86-6705-4616-bd15-e4c87159279f","Type":"ContainerDied","Data":"f6b74bc0f2a02cb16dff8f268846c5c6fd63e979a013944fdfc94103bb5ee32c"}
Apr 20 20:29:46.824643 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.824501 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk" event={"ID":"3148df86-6705-4616-bd15-e4c87159279f","Type":"ContainerDied","Data":"6040fd726bfdfbeba96478de439d520cfe0041a9bae7e0d0068fed58ef3b1caa"}
Apr 20 20:29:46.824643 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.824515 2576 scope.go:117] "RemoveContainer" containerID="6e3aeabc990811441fd45b7fe6851f4908420d9c08fd19c3eea8ede6964e516e"
Apr 20 20:29:46.824643 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.824516 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"
Apr 20 20:29:46.835840 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.835811 2576 scope.go:117] "RemoveContainer" containerID="f6b74bc0f2a02cb16dff8f268846c5c6fd63e979a013944fdfc94103bb5ee32c"
Apr 20 20:29:46.843752 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.843728 2576 scope.go:117] "RemoveContainer" containerID="f31bcf3a2482a552f19555fdf0f0ed7f237081dbc61b15e4d54e5c1ff85713ba"
Apr 20 20:29:46.847160 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.847140 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"]
Apr 20 20:29:46.850551 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.850528 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-wvtqk"]
Apr 20 20:29:46.853220 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.853189 2576 scope.go:117] "RemoveContainer" containerID="d374e49db21e668cbdb5c1eb71fae701bcec4793ae98076447afd6be57972c26"
Apr 20 20:29:46.860723 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.860705 2576 scope.go:117] "RemoveContainer" containerID="6e3aeabc990811441fd45b7fe6851f4908420d9c08fd19c3eea8ede6964e516e"
Apr 20 20:29:46.860970 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:29:46.860945 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3aeabc990811441fd45b7fe6851f4908420d9c08fd19c3eea8ede6964e516e\": container with ID starting with 6e3aeabc990811441fd45b7fe6851f4908420d9c08fd19c3eea8ede6964e516e not found: ID does not exist" containerID="6e3aeabc990811441fd45b7fe6851f4908420d9c08fd19c3eea8ede6964e516e"
Apr 20 20:29:46.861012 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.860977 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3aeabc990811441fd45b7fe6851f4908420d9c08fd19c3eea8ede6964e516e"} err="failed to get container status \"6e3aeabc990811441fd45b7fe6851f4908420d9c08fd19c3eea8ede6964e516e\": rpc error: code = NotFound desc = could not find container \"6e3aeabc990811441fd45b7fe6851f4908420d9c08fd19c3eea8ede6964e516e\": container with ID starting with 6e3aeabc990811441fd45b7fe6851f4908420d9c08fd19c3eea8ede6964e516e not found: ID does not exist"
Apr 20 20:29:46.861012 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.860996 2576 scope.go:117] "RemoveContainer" containerID="f6b74bc0f2a02cb16dff8f268846c5c6fd63e979a013944fdfc94103bb5ee32c"
Apr 20 20:29:46.861247 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:29:46.861222 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b74bc0f2a02cb16dff8f268846c5c6fd63e979a013944fdfc94103bb5ee32c\": container with ID starting with f6b74bc0f2a02cb16dff8f268846c5c6fd63e979a013944fdfc94103bb5ee32c not found: ID does not exist" containerID="f6b74bc0f2a02cb16dff8f268846c5c6fd63e979a013944fdfc94103bb5ee32c"
Apr 20 20:29:46.861347 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.861251 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b74bc0f2a02cb16dff8f268846c5c6fd63e979a013944fdfc94103bb5ee32c"} err="failed to get container status \"f6b74bc0f2a02cb16dff8f268846c5c6fd63e979a013944fdfc94103bb5ee32c\": rpc error: code = NotFound desc = could not find container \"f6b74bc0f2a02cb16dff8f268846c5c6fd63e979a013944fdfc94103bb5ee32c\": container with ID starting with f6b74bc0f2a02cb16dff8f268846c5c6fd63e979a013944fdfc94103bb5ee32c not found: ID does not exist"
Apr 20 20:29:46.861347 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.861268 2576 scope.go:117] "RemoveContainer" containerID="f31bcf3a2482a552f19555fdf0f0ed7f237081dbc61b15e4d54e5c1ff85713ba"
Apr 20 20:29:46.861499 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:29:46.861483 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f31bcf3a2482a552f19555fdf0f0ed7f237081dbc61b15e4d54e5c1ff85713ba\": container with ID starting with f31bcf3a2482a552f19555fdf0f0ed7f237081dbc61b15e4d54e5c1ff85713ba not found: ID does not exist" containerID="f31bcf3a2482a552f19555fdf0f0ed7f237081dbc61b15e4d54e5c1ff85713ba"
Apr 20 20:29:46.861542 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.861505 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f31bcf3a2482a552f19555fdf0f0ed7f237081dbc61b15e4d54e5c1ff85713ba"} err="failed to get container status \"f31bcf3a2482a552f19555fdf0f0ed7f237081dbc61b15e4d54e5c1ff85713ba\": rpc error: code = NotFound desc = could not find container \"f31bcf3a2482a552f19555fdf0f0ed7f237081dbc61b15e4d54e5c1ff85713ba\": container with ID starting with f31bcf3a2482a552f19555fdf0f0ed7f237081dbc61b15e4d54e5c1ff85713ba not found: ID does not exist"
Apr 20 20:29:46.861542 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.861520 2576 scope.go:117] "RemoveContainer" containerID="d374e49db21e668cbdb5c1eb71fae701bcec4793ae98076447afd6be57972c26"
Apr 20 20:29:46.861747 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:29:46.861729 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d374e49db21e668cbdb5c1eb71fae701bcec4793ae98076447afd6be57972c26\": container with ID starting with d374e49db21e668cbdb5c1eb71fae701bcec4793ae98076447afd6be57972c26 not found: ID does not exist" containerID="d374e49db21e668cbdb5c1eb71fae701bcec4793ae98076447afd6be57972c26"
Apr 20 20:29:46.861788 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:46.861751 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d374e49db21e668cbdb5c1eb71fae701bcec4793ae98076447afd6be57972c26"} err="failed to get container status \"d374e49db21e668cbdb5c1eb71fae701bcec4793ae98076447afd6be57972c26\": rpc error: code = NotFound desc = could not find container \"d374e49db21e668cbdb5c1eb71fae701bcec4793ae98076447afd6be57972c26\": container with ID starting with d374e49db21e668cbdb5c1eb71fae701bcec4793ae98076447afd6be57972c26 not found: ID does not exist"
Apr 20 20:29:47.945008 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:47.944972 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3148df86-6705-4616-bd15-e4c87159279f" path="/var/lib/kubelet/pods/3148df86-6705-4616-bd15-e4c87159279f/volumes"
Apr 20 20:29:51.799246 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:29:51.799210 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 20 20:30:01.799575 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:01.799539 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 20 20:30:11.799220 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:11.799180 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused"
Apr 20 20:30:21.799277 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:21.799245 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:30:27.598424 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:27.598384 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"]
Apr 20 20:30:27.598846 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:27.598715 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="kserve-container" containerID="cri-o://22cd599f1206771cf0cc398ce2ab0170fbe78a8a5f7eb139b887683a5775de1e" gracePeriod=30
Apr 20 20:30:27.598846 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:27.598745 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="kube-rbac-proxy" containerID="cri-o://f84714d1a630dd4a39571f26bb1098d95d8157cfa9d6c7727f0e4d0fa9042006" gracePeriod=30
Apr 20 20:30:27.963730 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:27.963644 2576 generic.go:358] "Generic (PLEG): container finished" podID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerID="f84714d1a630dd4a39571f26bb1098d95d8157cfa9d6c7727f0e4d0fa9042006" exitCode=2
Apr 20 20:30:27.963730 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:27.963715 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" event={"ID":"68f25676-8fce-485d-93b4-7f2c97b1ab0b","Type":"ContainerDied","Data":"f84714d1a630dd4a39571f26bb1098d95d8157cfa9d6c7727f0e4d0fa9042006"}
Apr 20 20:30:29.973417 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:29.973384 2576 generic.go:358] "Generic (PLEG): container finished" podID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerID="22cd599f1206771cf0cc398ce2ab0170fbe78a8a5f7eb139b887683a5775de1e" exitCode=0
Apr 20 20:30:29.973728 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:29.973429 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" event={"ID":"68f25676-8fce-485d-93b4-7f2c97b1ab0b","Type":"ContainerDied","Data":"22cd599f1206771cf0cc398ce2ab0170fbe78a8a5f7eb139b887683a5775de1e"}
Apr 20 20:30:30.038613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.038592 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:30:30.132916 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.132841 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kbrn\" (UniqueName: \"kubernetes.io/projected/68f25676-8fce-485d-93b4-7f2c97b1ab0b-kube-api-access-9kbrn\") pod \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") "
Apr 20 20:30:30.133065 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.132940 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68f25676-8fce-485d-93b4-7f2c97b1ab0b-kserve-provision-location\") pod \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") "
Apr 20 20:30:30.133065 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.132968 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68f25676-8fce-485d-93b4-7f2c97b1ab0b-proxy-tls\") pod \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") "
Apr 20 20:30:30.133065 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.132998 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/68f25676-8fce-485d-93b4-7f2c97b1ab0b-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\" (UID: \"68f25676-8fce-485d-93b4-7f2c97b1ab0b\") "
Apr 20 20:30:30.133406 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.133371 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68f25676-8fce-485d-93b4-7f2c97b1ab0b-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod "68f25676-8fce-485d-93b4-7f2c97b1ab0b" (UID: "68f25676-8fce-485d-93b4-7f2c97b1ab0b"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:30:30.135011 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.134983 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f25676-8fce-485d-93b4-7f2c97b1ab0b-kube-api-access-9kbrn" (OuterVolumeSpecName: "kube-api-access-9kbrn") pod "68f25676-8fce-485d-93b4-7f2c97b1ab0b" (UID: "68f25676-8fce-485d-93b4-7f2c97b1ab0b"). InnerVolumeSpecName "kube-api-access-9kbrn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:30:30.135126 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.134992 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68f25676-8fce-485d-93b4-7f2c97b1ab0b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "68f25676-8fce-485d-93b4-7f2c97b1ab0b" (UID: "68f25676-8fce-485d-93b4-7f2c97b1ab0b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:30:30.141778 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.141756 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68f25676-8fce-485d-93b4-7f2c97b1ab0b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "68f25676-8fce-485d-93b4-7f2c97b1ab0b" (UID: "68f25676-8fce-485d-93b4-7f2c97b1ab0b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:30:30.234378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.234352 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/68f25676-8fce-485d-93b4-7f2c97b1ab0b-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:30:30.234378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.234377 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/68f25676-8fce-485d-93b4-7f2c97b1ab0b-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:30:30.234516 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.234389 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/68f25676-8fce-485d-93b4-7f2c97b1ab0b-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:30:30.234516 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.234399 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9kbrn\" (UniqueName: \"kubernetes.io/projected/68f25676-8fce-485d-93b4-7f2c97b1ab0b-kube-api-access-9kbrn\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:30:30.978397 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.978360 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f" event={"ID":"68f25676-8fce-485d-93b4-7f2c97b1ab0b","Type":"ContainerDied","Data":"c98b2249d0a1d53fef4114aa0575370e6111610bf7e096861ac668d3bb89c9ca"}
Apr 20 20:30:30.978397 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.978393 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"
Apr 20 20:30:30.978885 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.978408 2576 scope.go:117] "RemoveContainer" containerID="f84714d1a630dd4a39571f26bb1098d95d8157cfa9d6c7727f0e4d0fa9042006"
Apr 20 20:30:30.987242 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.987219 2576 scope.go:117] "RemoveContainer" containerID="22cd599f1206771cf0cc398ce2ab0170fbe78a8a5f7eb139b887683a5775de1e"
Apr 20 20:30:30.994915 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:30.994897 2576 scope.go:117] "RemoveContainer" containerID="c798943721896e3fa8c48a0d58d551503c2d14ce50d5740d170239c581b4bac6"
Apr 20 20:30:31.002918 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:31.001376 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"]
Apr 20 20:30:31.005302 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:31.005282 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-hmh9f"]
Apr 20 20:30:31.943913 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:31.943878 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" path="/var/lib/kubelet/pods/68f25676-8fce-485d-93b4-7f2c97b1ab0b/volumes"
Apr 20 20:30:54.383265 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:54.383161 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log"
Apr 20 20:30:54.385903 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:30:54.385881 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log"
Apr 20 20:32:51.089207 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089096 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"]
Apr 20 20:32:51.089599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089499 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kserve-agent"
Apr 20 20:32:51.089599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089511 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kserve-agent"
Apr 20 20:32:51.089599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089522 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="kserve-container"
Apr 20 20:32:51.089599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089529 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="kserve-container"
Apr 20 20:32:51.089599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089537 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kube-rbac-proxy"
Apr 20 20:32:51.089599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089543 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kube-rbac-proxy"
Apr 20 20:32:51.089599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089549 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="kube-rbac-proxy"
Apr 20 20:32:51.089599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089554 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="kube-rbac-proxy"
Apr 20 20:32:51.089599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089566 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="storage-initializer"
Apr 20 20:32:51.089599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089572 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="storage-initializer"
Apr 20 20:32:51.089599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089577 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="storage-initializer"
Apr 20 20:32:51.089599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089582 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="storage-initializer"
Apr 20 20:32:51.089599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089587 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kserve-container"
Apr 20 20:32:51.089599 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089592 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kserve-container"
Apr 20 20:32:51.090017 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089644 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="kube-rbac-proxy"
Apr 20 20:32:51.090017 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089652 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kserve-container"
Apr 20 20:32:51.090017 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089659 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kube-rbac-proxy"
Apr 20 20:32:51.090017 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089665 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3148df86-6705-4616-bd15-e4c87159279f" containerName="kserve-agent"
Apr 20 20:32:51.090017 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.089672 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="68f25676-8fce-485d-93b4-7f2c97b1ab0b" containerName="kserve-container"
Apr 20 20:32:51.092603 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.092581 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:32:51.095536 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.095509 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 20 20:32:51.096717 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.096697 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 20 20:32:51.096842 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.096762 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\""
Apr 20 20:32:51.096842 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.096804 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5zv6\""
Apr 20 20:32:51.096970 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.096874 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\""
Apr 20 20:32:51.102953 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.102931 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"]
Apr 20 20:32:51.207885 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.207851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef87595b-7c95-4f09-ac9b-de1e454d6796-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-rvjm2\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:32:51.208042 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.207893 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78ssm\" (UniqueName: \"kubernetes.io/projected/ef87595b-7c95-4f09-ac9b-de1e454d6796-kube-api-access-78ssm\") pod \"isvc-pmml-predictor-8bb578669-rvjm2\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:32:51.208042 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.207926 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef87595b-7c95-4f09-ac9b-de1e454d6796-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-rvjm2\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:32:51.208042 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.207948 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ef87595b-7c95-4f09-ac9b-de1e454d6796-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-rvjm2\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:32:51.308912 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.308876 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef87595b-7c95-4f09-ac9b-de1e454d6796-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-rvjm2\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:32:51.308912 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.308914 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ef87595b-7c95-4f09-ac9b-de1e454d6796-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-rvjm2\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:32:51.309141 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.308967 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef87595b-7c95-4f09-ac9b-de1e454d6796-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-rvjm2\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:32:51.309141 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.308991 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78ssm\" (UniqueName: \"kubernetes.io/projected/ef87595b-7c95-4f09-ac9b-de1e454d6796-kube-api-access-78ssm\") pod \"isvc-pmml-predictor-8bb578669-rvjm2\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:32:51.309467 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.309445 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef87595b-7c95-4f09-ac9b-de1e454d6796-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-rvjm2\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:32:51.309658 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.309637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ef87595b-7c95-4f09-ac9b-de1e454d6796-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-rvjm2\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:32:51.311281 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.311265 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef87595b-7c95-4f09-ac9b-de1e454d6796-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-rvjm2\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:32:51.317372 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.317349 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78ssm\" (UniqueName: \"kubernetes.io/projected/ef87595b-7c95-4f09-ac9b-de1e454d6796-kube-api-access-78ssm\") pod \"isvc-pmml-predictor-8bb578669-rvjm2\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:32:51.405690 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.405622 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:32:51.529351 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:51.529329 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"]
Apr 20 20:32:51.531177 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:32:51.531148 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef87595b_7c95_4f09_ac9b_de1e454d6796.slice/crio-50f27fdd40bfb207112daad65ba3aa2818fa58d9473a06368c581c15e59b7b36 WatchSource:0}: Error finding container 50f27fdd40bfb207112daad65ba3aa2818fa58d9473a06368c581c15e59b7b36: Status 404 returned error can't find the container with id 50f27fdd40bfb207112daad65ba3aa2818fa58d9473a06368c581c15e59b7b36
Apr 20 20:32:52.441716 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:52.441683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" event={"ID":"ef87595b-7c95-4f09-ac9b-de1e454d6796","Type":"ContainerStarted","Data":"591652d895c7df6d9057e516c438839080403fc04ff624330ff032596feafc38"}
Apr 20 20:32:52.441716 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:52.441717 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" event={"ID":"ef87595b-7c95-4f09-ac9b-de1e454d6796","Type":"ContainerStarted","Data":"50f27fdd40bfb207112daad65ba3aa2818fa58d9473a06368c581c15e59b7b36"}
Apr 20 20:32:55.451811 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:55.451779 2576 generic.go:358] "Generic (PLEG): container finished" podID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerID="591652d895c7df6d9057e516c438839080403fc04ff624330ff032596feafc38" exitCode=0
Apr 20 20:32:55.452136 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:32:55.451849 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" event={"ID":"ef87595b-7c95-4f09-ac9b-de1e454d6796","Type":"ContainerDied","Data":"591652d895c7df6d9057e516c438839080403fc04ff624330ff032596feafc38"}
Apr 20 20:33:02.486054 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:33:02.486011 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" event={"ID":"ef87595b-7c95-4f09-ac9b-de1e454d6796","Type":"ContainerStarted","Data":"fc709cbe5965944e8037c2c3b2cf3683f084ccb018670393d6f948bcd6c29ca7"}
Apr 20 20:33:03.490949 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:33:03.490914 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" event={"ID":"ef87595b-7c95-4f09-ac9b-de1e454d6796","Type":"ContainerStarted","Data":"c8ae4cb46cc9aea1b05732763aa85782ce0907e5f379a08df8fa81bef0b3e39d"}
Apr 20 20:33:03.491328 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:33:03.491088 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:33:03.510725 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:33:03.510677 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" podStartSLOduration=5.562984984 podStartE2EDuration="12.510665322s" podCreationTimestamp="2026-04-20 20:32:51 +0000 UTC" firstStartedPulling="2026-04-20 20:32:55.452952075 +0000 UTC m=+1650.165634050" lastFinishedPulling="2026-04-20 20:33:02.40063241 +0000 UTC
m=+1657.113314388" observedRunningTime="2026-04-20 20:33:03.508233985 +0000 UTC m=+1658.220915982" watchObservedRunningTime="2026-04-20 20:33:03.510665322 +0000 UTC m=+1658.223347319" Apr 20 20:33:04.494727 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:33:04.494693 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" Apr 20 20:33:04.495947 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:33:04.495920 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 20 20:33:05.497769 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:33:05.497725 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 20 20:33:10.501837 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:33:10.501810 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" Apr 20 20:33:10.502444 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:33:10.502414 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 20 20:33:20.502713 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:33:20.502675 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 20 20:33:30.502561 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:33:30.502521 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 20 20:33:40.502903 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:33:40.502817 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 20 20:33:50.502692 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:33:50.502653 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 20 20:34:00.502869 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:00.502818 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 20 20:34:10.502798 
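The repeating "Probe failed ... connect: connection refused" entries above are the kubelet's TCP readiness probe dialing the pod IP while the model server inside kserve-container is still loading; the timestamps arrive roughly ten seconds apart, matching the probe period. A minimal sketch of an equivalent check in Go (illustrative only, not kubelet source; the address and interval are taken from the log, the helper name probeTCP is invented):

    package main

    import (
    	"fmt"
    	"net"
    	"time"
    )

    // probeTCP mirrors what a TCP readiness probe does: a successful
    // connect means "ready"; "connect: connection refused" means nothing
    // is listening on the port yet.
    func probeTCP(addr string, timeout time.Duration) error {
    	conn, err := net.DialTimeout("tcp", addr, timeout)
    	if err != nil {
    		return err
    	}
    	return conn.Close()
    }

    func main() {
    	addr := "10.132.0.48:8080" // pod IP and container port from the entries above
    	for i := 0; i < 10; i++ {
    		if err := probeTCP(addr, time.Second); err != nil {
    			fmt.Println("Probe failed:", err)
    			time.Sleep(10 * time.Second) // the failures above arrive ~10s apart
    			continue
    		}
    		fmt.Println("ready")
    		return
    	}
    }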
Apr 20 20:34:20.502958 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:20.502930 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:34:21.989025 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:21.988997 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"]
Apr 20 20:34:21.989450 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:21.989299 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kserve-container" containerID="cri-o://fc709cbe5965944e8037c2c3b2cf3683f084ccb018670393d6f948bcd6c29ca7" gracePeriod=30
Apr 20 20:34:21.989450 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:21.989332 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kube-rbac-proxy" containerID="cri-o://c8ae4cb46cc9aea1b05732763aa85782ce0907e5f379a08df8fa81bef0b3e39d" gracePeriod=30
Apr 20 20:34:22.752781 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:22.752752 2576 generic.go:358] "Generic (PLEG): container finished" podID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerID="c8ae4cb46cc9aea1b05732763aa85782ce0907e5f379a08df8fa81bef0b3e39d" exitCode=2
Apr 20 20:34:22.752949 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:22.752833 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" event={"ID":"ef87595b-7c95-4f09-ac9b-de1e454d6796","Type":"ContainerDied","Data":"c8ae4cb46cc9aea1b05732763aa85782ce0907e5f379a08df8fa81bef0b3e39d"}
Apr 20 20:34:25.218437 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.218418 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:34:25.336212 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.336151 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef87595b-7c95-4f09-ac9b-de1e454d6796-proxy-tls\") pod \"ef87595b-7c95-4f09-ac9b-de1e454d6796\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") "
Apr 20 20:34:25.336212 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.336207 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78ssm\" (UniqueName: \"kubernetes.io/projected/ef87595b-7c95-4f09-ac9b-de1e454d6796-kube-api-access-78ssm\") pod \"ef87595b-7c95-4f09-ac9b-de1e454d6796\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") "
Apr 20 20:34:25.336373 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.336249 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ef87595b-7c95-4f09-ac9b-de1e454d6796-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"ef87595b-7c95-4f09-ac9b-de1e454d6796\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") "
Apr 20 20:34:25.336373 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.336269 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef87595b-7c95-4f09-ac9b-de1e454d6796-kserve-provision-location\") pod \"ef87595b-7c95-4f09-ac9b-de1e454d6796\" (UID: \"ef87595b-7c95-4f09-ac9b-de1e454d6796\") "
Apr 20 20:34:25.336637 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.336613 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef87595b-7c95-4f09-ac9b-de1e454d6796-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ef87595b-7c95-4f09-ac9b-de1e454d6796" (UID: "ef87595b-7c95-4f09-ac9b-de1e454d6796"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:34:25.336719 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.336619 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef87595b-7c95-4f09-ac9b-de1e454d6796-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "ef87595b-7c95-4f09-ac9b-de1e454d6796" (UID: "ef87595b-7c95-4f09-ac9b-de1e454d6796"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:34:25.338164 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.338139 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ef87595b-7c95-4f09-ac9b-de1e454d6796-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ef87595b-7c95-4f09-ac9b-de1e454d6796" (UID: "ef87595b-7c95-4f09-ac9b-de1e454d6796"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:34:25.338256 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.338201 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef87595b-7c95-4f09-ac9b-de1e454d6796-kube-api-access-78ssm" (OuterVolumeSpecName: "kube-api-access-78ssm") pod "ef87595b-7c95-4f09-ac9b-de1e454d6796" (UID: "ef87595b-7c95-4f09-ac9b-de1e454d6796"). InnerVolumeSpecName "kube-api-access-78ssm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:34:25.437692 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.437669 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-78ssm\" (UniqueName: \"kubernetes.io/projected/ef87595b-7c95-4f09-ac9b-de1e454d6796-kube-api-access-78ssm\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:34:25.437692 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.437691 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ef87595b-7c95-4f09-ac9b-de1e454d6796-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:34:25.437822 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.437703 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ef87595b-7c95-4f09-ac9b-de1e454d6796-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:34:25.437822 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.437713 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ef87595b-7c95-4f09-ac9b-de1e454d6796-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:34:25.766348 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.766275 2576 generic.go:358] "Generic (PLEG): container finished" podID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerID="fc709cbe5965944e8037c2c3b2cf3683f084ccb018670393d6f948bcd6c29ca7" exitCode=0
Apr 20 20:34:25.766348 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.766343 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" event={"ID":"ef87595b-7c95-4f09-ac9b-de1e454d6796","Type":"ContainerDied","Data":"fc709cbe5965944e8037c2c3b2cf3683f084ccb018670393d6f948bcd6c29ca7"}
Apr 20 20:34:25.766526 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.766349 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"
Apr 20 20:34:25.766526 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.766372 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2" event={"ID":"ef87595b-7c95-4f09-ac9b-de1e454d6796","Type":"ContainerDied","Data":"50f27fdd40bfb207112daad65ba3aa2818fa58d9473a06368c581c15e59b7b36"}
Apr 20 20:34:25.766526 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.766388 2576 scope.go:117] "RemoveContainer" containerID="c8ae4cb46cc9aea1b05732763aa85782ce0907e5f379a08df8fa81bef0b3e39d"
Apr 20 20:34:25.774824 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.774806 2576 scope.go:117] "RemoveContainer" containerID="fc709cbe5965944e8037c2c3b2cf3683f084ccb018670393d6f948bcd6c29ca7"
Apr 20 20:34:25.781663 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.781644 2576 scope.go:117] "RemoveContainer" containerID="591652d895c7df6d9057e516c438839080403fc04ff624330ff032596feafc38"
Apr 20 20:34:25.788081 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.788047 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"]
Apr 20 20:34:25.789090 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.789075 2576 scope.go:117] "RemoveContainer" containerID="c8ae4cb46cc9aea1b05732763aa85782ce0907e5f379a08df8fa81bef0b3e39d"
Apr 20 20:34:25.789351 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:34:25.789334 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8ae4cb46cc9aea1b05732763aa85782ce0907e5f379a08df8fa81bef0b3e39d\": container with ID starting with c8ae4cb46cc9aea1b05732763aa85782ce0907e5f379a08df8fa81bef0b3e39d not found: ID does not exist" containerID="c8ae4cb46cc9aea1b05732763aa85782ce0907e5f379a08df8fa81bef0b3e39d"
Apr 20 20:34:25.789401 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.789359 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8ae4cb46cc9aea1b05732763aa85782ce0907e5f379a08df8fa81bef0b3e39d"} err="failed to get container status \"c8ae4cb46cc9aea1b05732763aa85782ce0907e5f379a08df8fa81bef0b3e39d\": rpc error: code = NotFound desc = could not find container \"c8ae4cb46cc9aea1b05732763aa85782ce0907e5f379a08df8fa81bef0b3e39d\": container with ID starting with c8ae4cb46cc9aea1b05732763aa85782ce0907e5f379a08df8fa81bef0b3e39d not found: ID does not exist"
Apr 20 20:34:25.789401 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.789378 2576 scope.go:117] "RemoveContainer" containerID="fc709cbe5965944e8037c2c3b2cf3683f084ccb018670393d6f948bcd6c29ca7"
Apr 20 20:34:25.789596 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:34:25.789578 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc709cbe5965944e8037c2c3b2cf3683f084ccb018670393d6f948bcd6c29ca7\": container with ID starting with fc709cbe5965944e8037c2c3b2cf3683f084ccb018670393d6f948bcd6c29ca7 not found: ID does not exist" containerID="fc709cbe5965944e8037c2c3b2cf3683f084ccb018670393d6f948bcd6c29ca7"
Apr 20 20:34:25.789661 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.789615 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc709cbe5965944e8037c2c3b2cf3683f084ccb018670393d6f948bcd6c29ca7"} err="failed to get container status \"fc709cbe5965944e8037c2c3b2cf3683f084ccb018670393d6f948bcd6c29ca7\": rpc error: code = NotFound desc = could not find container \"fc709cbe5965944e8037c2c3b2cf3683f084ccb018670393d6f948bcd6c29ca7\": container with ID starting with fc709cbe5965944e8037c2c3b2cf3683f084ccb018670393d6f948bcd6c29ca7 not found: ID does not exist"
Apr 20 20:34:25.789661 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.789639 2576 scope.go:117] "RemoveContainer" containerID="591652d895c7df6d9057e516c438839080403fc04ff624330ff032596feafc38"
Apr 20 20:34:25.789878 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:34:25.789860 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"591652d895c7df6d9057e516c438839080403fc04ff624330ff032596feafc38\": container with ID starting with 591652d895c7df6d9057e516c438839080403fc04ff624330ff032596feafc38 not found: ID does not exist" containerID="591652d895c7df6d9057e516c438839080403fc04ff624330ff032596feafc38"
Apr 20 20:34:25.789918 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.789884 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"591652d895c7df6d9057e516c438839080403fc04ff624330ff032596feafc38"} err="failed to get container status \"591652d895c7df6d9057e516c438839080403fc04ff624330ff032596feafc38\": rpc error: code = NotFound desc = could not find container \"591652d895c7df6d9057e516c438839080403fc04ff624330ff032596feafc38\": container with ID starting with 591652d895c7df6d9057e516c438839080403fc04ff624330ff032596feafc38 not found: ID does not exist"
Apr 20 20:34:25.794923 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.794901 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-rvjm2"]
Apr 20 20:34:25.943901 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:34:25.943869 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" path="/var/lib/kubelet/pods/ef87595b-7c95-4f09-ac9b-de1e454d6796/volumes"
Apr 20 20:35:54.405721 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:35:54.405614 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log"
Apr 20 20:35:54.410344 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:35:54.410325 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log"
Apr 20 20:36:03.055197 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.055165 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz"]
Apr 20 20:36:03.055546 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.055512 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kserve-container"
Apr 20 20:36:03.055546 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.055523 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kserve-container"
Apr 20 20:36:03.055546 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.055535 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kube-rbac-proxy"
Apr 20 20:36:03.055546 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.055541 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kube-rbac-proxy"
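The paired "RemoveContainer" / "ContainerStatus from runtime service failed ... NotFound" entries above are a benign race: by the time the kubelet re-queried CRI-O for status, the container it was asked to delete was already gone, so the DeleteContainer errors are harmless. A common way to keep such a delete path idempotent is to treat NotFound as success; a hedged sketch under that assumption (removeContainer and errNotFound are illustrative names, not kubelet code):

    package main

    import (
    	"errors"
    	"fmt"
    )

    // errNotFound stands in for the runtime's gRPC NotFound status.
    var errNotFound = errors.New("NotFound: ID does not exist")

    // removeContainer swallows "already gone" so cleanup can be retried
    // safely: deleting something that no longer exists is still success.
    func removeContainer(id string, remove func(string) error) error {
    	err := remove(id)
    	if err != nil && errors.Is(err, errNotFound) {
    		return nil // container already removed: nothing left to do
    	}
    	return err
    }

    func main() {
    	err := removeContainer("c8ae4cb4", func(string) error { return errNotFound })
    	fmt.Println("removed (idempotent):", err == nil) // true
    }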
podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kube-rbac-proxy" Apr 20 20:36:03.055681 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.055551 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="storage-initializer" Apr 20 20:36:03.055681 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.055560 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="storage-initializer" Apr 20 20:36:03.055681 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.055616 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kube-rbac-proxy" Apr 20 20:36:03.055681 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.055628 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ef87595b-7c95-4f09-ac9b-de1e454d6796" containerName="kserve-container" Apr 20 20:36:03.058739 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.058723 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" Apr 20 20:36:03.061280 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.061260 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 20 20:36:03.061380 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.061293 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 20 20:36:03.061439 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.061412 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\"" Apr 20 20:36:03.061494 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.061412 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 20 20:36:03.062488 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.062472 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5zv6\"" Apr 20 20:36:03.067964 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.067944 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz"] Apr 20 20:36:03.155170 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.155141 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e419c2c-59ae-4c78-bae9-ce583fb091c8-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" Apr 20 20:36:03.155309 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.155193 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e419c2c-59ae-4c78-bae9-ce583fb091c8-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" Apr 20 20:36:03.155309 ip-10-0-143-23 
kubenswrapper[2576]: I0420 20:36:03.155256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbgvt\" (UniqueName: \"kubernetes.io/projected/3e419c2c-59ae-4c78-bae9-ce583fb091c8-kube-api-access-fbgvt\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" Apr 20 20:36:03.155309 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.155275 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e419c2c-59ae-4c78-bae9-ce583fb091c8-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" Apr 20 20:36:03.255701 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.255675 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e419c2c-59ae-4c78-bae9-ce583fb091c8-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" Apr 20 20:36:03.255826 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.255721 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbgvt\" (UniqueName: \"kubernetes.io/projected/3e419c2c-59ae-4c78-bae9-ce583fb091c8-kube-api-access-fbgvt\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" Apr 20 20:36:03.255826 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.255741 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e419c2c-59ae-4c78-bae9-ce583fb091c8-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" Apr 20 20:36:03.255826 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.255769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e419c2c-59ae-4c78-bae9-ce583fb091c8-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" Apr 20 20:36:03.255976 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:36:03.255843 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-serving-cert: secret "isvc-pmml-v2-kserve-predictor-serving-cert" not found Apr 20 20:36:03.255976 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:36:03.255923 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e419c2c-59ae-4c78-bae9-ce583fb091c8-proxy-tls podName:3e419c2c-59ae-4c78-bae9-ce583fb091c8 nodeName:}" failed. No retries permitted until 2026-04-20 20:36:03.755901702 +0000 UTC m=+1838.468583683 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/3e419c2c-59ae-4c78-bae9-ce583fb091c8-proxy-tls") pod "isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" (UID: "3e419c2c-59ae-4c78-bae9-ce583fb091c8") : secret "isvc-pmml-v2-kserve-predictor-serving-cert" not found Apr 20 20:36:03.256149 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.256126 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e419c2c-59ae-4c78-bae9-ce583fb091c8-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" Apr 20 20:36:03.256370 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.256353 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e419c2c-59ae-4c78-bae9-ce583fb091c8-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" Apr 20 20:36:03.264685 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.264657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbgvt\" (UniqueName: \"kubernetes.io/projected/3e419c2c-59ae-4c78-bae9-ce583fb091c8-kube-api-access-fbgvt\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" Apr 20 20:36:03.759881 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.759842 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e419c2c-59ae-4c78-bae9-ce583fb091c8-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" Apr 20 20:36:03.762303 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.762276 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e419c2c-59ae-4c78-bae9-ce583fb091c8-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" Apr 20 20:36:03.971629 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:03.971594 2576 util.go:30] "No sandbox for pod can be found. 
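The "secret ... not found" failure above is transient: the serving cert had not been created yet on the volume manager's first pass, so the operation was rescheduled ("No retries permitted until ... durationBeforeRetry 500ms") and the proxy-tls mount succeeded roughly half a second later once the secret existed. A sketch of that retry shape, assuming a doubling delay with a cap (withBackoff and its constants are illustrative, not the kubelet's actual implementation):

    package main

    import (
    	"errors"
    	"fmt"
    	"time"
    )

    // withBackoff retries op with a doubling delay between attempts,
    // the same shape as the volume manager's rescheduling above.
    func withBackoff(op func() error, initial, max time.Duration, attempts int) error {
    	delay := initial
    	var err error
    	for i := 0; i < attempts; i++ {
    		if err = op(); err == nil {
    			return nil
    		}
    		fmt.Printf("failed: %v; retrying in %v\n", err, delay)
    		time.Sleep(delay)
    		if delay *= 2; delay > max {
    			delay = max
    		}
    	}
    	return err
    }

    func main() {
    	calls := 0
    	err := withBackoff(func() error {
    		calls++
    		if calls == 1 {
    			return errors.New(`secret "isvc-pmml-v2-kserve-predictor-serving-cert" not found`)
    		}
    		return nil // second pass: the secret exists now, SetUp succeeds
    	}, 500*time.Millisecond, 10*time.Second, 5)
    	fmt.Println("mounted:", err == nil)
    }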
Apr 20 20:36:04.091516 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:04.091493 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz"]
Apr 20 20:36:04.094224 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:36:04.094185 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e419c2c_59ae_4c78_bae9_ce583fb091c8.slice/crio-55d8c5b374005319cb27d881cf92d5b5b6119bb0c30981e3443cf97c17fe9a74 WatchSource:0}: Error finding container 55d8c5b374005319cb27d881cf92d5b5b6119bb0c30981e3443cf97c17fe9a74: Status 404 returned error can't find the container with id 55d8c5b374005319cb27d881cf92d5b5b6119bb0c30981e3443cf97c17fe9a74
Apr 20 20:36:04.096074 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:04.096046 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:36:05.084348 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:05.084314 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" event={"ID":"3e419c2c-59ae-4c78-bae9-ce583fb091c8","Type":"ContainerStarted","Data":"1b7130f145b6845cdce383f031c16771a8c3cf747f29398803b147443fc3fc4b"}
Apr 20 20:36:05.084348 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:05.084349 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" event={"ID":"3e419c2c-59ae-4c78-bae9-ce583fb091c8","Type":"ContainerStarted","Data":"55d8c5b374005319cb27d881cf92d5b5b6119bb0c30981e3443cf97c17fe9a74"}
Apr 20 20:36:08.094789 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:08.094759 2576 generic.go:358] "Generic (PLEG): container finished" podID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerID="1b7130f145b6845cdce383f031c16771a8c3cf747f29398803b147443fc3fc4b" exitCode=0
Apr 20 20:36:08.095184 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:08.094831 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" event={"ID":"3e419c2c-59ae-4c78-bae9-ce583fb091c8","Type":"ContainerDied","Data":"1b7130f145b6845cdce383f031c16771a8c3cf747f29398803b147443fc3fc4b"}
Apr 20 20:36:09.099965 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:09.099934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" event={"ID":"3e419c2c-59ae-4c78-bae9-ce583fb091c8","Type":"ContainerStarted","Data":"0233b0c9b259b3a89399f331e5da4b727db50cb5b4d0dd3cfab3432b4b5e259b"}
Apr 20 20:36:09.099965 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:09.099971 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" event={"ID":"3e419c2c-59ae-4c78-bae9-ce583fb091c8","Type":"ContainerStarted","Data":"9ba61c3eb58614137746af6d78a0e1f8f1514818435b912010fb60c6bf5c9420"}
Apr 20 20:36:09.100401 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:09.100169 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz"
Apr 20 20:36:09.120340 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:09.120298 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" podStartSLOduration=6.12028737 podStartE2EDuration="6.12028737s" podCreationTimestamp="2026-04-20 20:36:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:36:09.117969151 +0000 UTC m=+1843.830651149" watchObservedRunningTime="2026-04-20 20:36:09.12028737 +0000 UTC m=+1843.832969367"
Apr 20 20:36:10.104188 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:10.104154 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz"
Apr 20 20:36:10.105235 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:10.105207 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 20 20:36:11.107545 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:11.107502 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 20 20:36:16.112129 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:16.112088 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz"
Apr 20 20:36:16.112669 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:16.112641 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 20 20:36:26.112913 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:26.112877 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 20 20:36:36.112629 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:36.112544 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 20 20:36:46.113353 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:46.113314 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 20 20:36:56.113020 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:36:56.112985 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 20 20:37:06.112522 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:06.112487 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 20 20:37:16.113565 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:16.113528 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 20 20:37:26.114032 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:26.114002 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz"
Apr 20 20:37:33.920661 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:33.920618 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz"]
Apr 20 20:37:33.921099 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:33.920962 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kserve-container" containerID="cri-o://9ba61c3eb58614137746af6d78a0e1f8f1514818435b912010fb60c6bf5c9420" gracePeriod=30
Apr 20 20:37:33.921099 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:33.921020 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kube-rbac-proxy" containerID="cri-o://0233b0c9b259b3a89399f331e5da4b727db50cb5b4d0dd3cfab3432b4b5e259b" gracePeriod=30
Apr 20 20:37:34.029713 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.029673 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"]
Apr 20 20:37:34.036964 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.036935 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"
Apr 20 20:37:34.039697 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.039667 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-63f047-predictor-serving-cert\""
Apr 20 20:37:34.039854 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.039715 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-63f047-kube-rbac-proxy-sar-config\""
Apr 20 20:37:34.045341 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.045321 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"]
Apr 20 20:37:34.107747 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.107719 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-proxy-tls\") pod \"isvc-primary-63f047-predictor-5b4c68b4d-xwbzq\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"
Apr 20 20:37:34.107899 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.107759 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs2vh\" (UniqueName: \"kubernetes.io/projected/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-kube-api-access-fs2vh\") pod \"isvc-primary-63f047-predictor-5b4c68b4d-xwbzq\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"
Apr 20 20:37:34.107899 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.107840 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-63f047-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-isvc-primary-63f047-kube-rbac-proxy-sar-config\") pod \"isvc-primary-63f047-predictor-5b4c68b4d-xwbzq\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"
Apr 20 20:37:34.107988 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.107916 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-kserve-provision-location\") pod \"isvc-primary-63f047-predictor-5b4c68b4d-xwbzq\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"
Apr 20 20:37:34.208558 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.208474 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-63f047-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-isvc-primary-63f047-kube-rbac-proxy-sar-config\") pod \"isvc-primary-63f047-predictor-5b4c68b4d-xwbzq\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"
Apr 20 20:37:34.208558 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.208527 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-kserve-provision-location\") pod \"isvc-primary-63f047-predictor-5b4c68b4d-xwbzq\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"
Apr 20 20:37:34.208558 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.208556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-proxy-tls\") pod \"isvc-primary-63f047-predictor-5b4c68b4d-xwbzq\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"
Apr 20 20:37:34.208836 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.208584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fs2vh\" (UniqueName: \"kubernetes.io/projected/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-kube-api-access-fs2vh\") pod \"isvc-primary-63f047-predictor-5b4c68b4d-xwbzq\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"
Apr 20 20:37:34.208974 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.208951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-kserve-provision-location\") pod \"isvc-primary-63f047-predictor-5b4c68b4d-xwbzq\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"
Apr 20 20:37:34.209217 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.209195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-63f047-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-isvc-primary-63f047-kube-rbac-proxy-sar-config\") pod \"isvc-primary-63f047-predictor-5b4c68b4d-xwbzq\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"
Apr 20 20:37:34.210934 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.210914 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-proxy-tls\") pod \"isvc-primary-63f047-predictor-5b4c68b4d-xwbzq\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"
Apr 20 20:37:34.216462 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.216442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs2vh\" (UniqueName: \"kubernetes.io/projected/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-kube-api-access-fs2vh\") pod \"isvc-primary-63f047-predictor-5b4c68b4d-xwbzq\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"
Apr 20 20:37:34.348495 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.348467 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"
Apr 20 20:37:34.383603 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.383566 2576 generic.go:358] "Generic (PLEG): container finished" podID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerID="0233b0c9b259b3a89399f331e5da4b727db50cb5b4d0dd3cfab3432b4b5e259b" exitCode=2
Apr 20 20:37:34.383744 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.383631 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" event={"ID":"3e419c2c-59ae-4c78-bae9-ce583fb091c8","Type":"ContainerDied","Data":"0233b0c9b259b3a89399f331e5da4b727db50cb5b4d0dd3cfab3432b4b5e259b"}
Apr 20 20:37:34.468997 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:34.468962 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"]
Apr 20 20:37:34.470762 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:37:34.470733 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d5611b1_ee37_4683_91f1_6d2a6fb3fcfb.slice/crio-462b9a5c2bbdced8f7cf10bcad06983439464085a09b16b742258956b84be0b5 WatchSource:0}: Error finding container 462b9a5c2bbdced8f7cf10bcad06983439464085a09b16b742258956b84be0b5: Status 404 returned error can't find the container with id 462b9a5c2bbdced8f7cf10bcad06983439464085a09b16b742258956b84be0b5
Apr 20 20:37:35.388061 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:35.388027 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" event={"ID":"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb","Type":"ContainerStarted","Data":"d73b2c8d1360a8fa7da48dfcda3a83fa3f7de7e67565b915136a7e2382c5e674"}
Apr 20 20:37:35.388456 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:35.388068 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" event={"ID":"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb","Type":"ContainerStarted","Data":"462b9a5c2bbdced8f7cf10bcad06983439464085a09b16b742258956b84be0b5"}
Apr 20 20:37:36.107715 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:36.107679 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.49:8643/healthz\": dial tcp 10.132.0.49:8643: connect: connection refused"
Apr 20 20:37:36.112997 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:36.112976 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 20 20:37:37.167025 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.167003 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz"
Apr 20 20:37:37.233594 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.233523 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbgvt\" (UniqueName: \"kubernetes.io/projected/3e419c2c-59ae-4c78-bae9-ce583fb091c8-kube-api-access-fbgvt\") pod \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") "
Apr 20 20:37:37.233722 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.233618 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e419c2c-59ae-4c78-bae9-ce583fb091c8-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") "
Apr 20 20:37:37.233722 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.233638 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e419c2c-59ae-4c78-bae9-ce583fb091c8-kserve-provision-location\") pod \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") "
Apr 20 20:37:37.233722 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.233656 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e419c2c-59ae-4c78-bae9-ce583fb091c8-proxy-tls\") pod \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\" (UID: \"3e419c2c-59ae-4c78-bae9-ce583fb091c8\") "
Apr 20 20:37:37.234035 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.234006 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3e419c2c-59ae-4c78-bae9-ce583fb091c8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3e419c2c-59ae-4c78-bae9-ce583fb091c8" (UID: "3e419c2c-59ae-4c78-bae9-ce583fb091c8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:37:37.234133 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.234017 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e419c2c-59ae-4c78-bae9-ce583fb091c8-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "3e419c2c-59ae-4c78-bae9-ce583fb091c8" (UID: "3e419c2c-59ae-4c78-bae9-ce583fb091c8"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:37:37.235523 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.235497 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e419c2c-59ae-4c78-bae9-ce583fb091c8-kube-api-access-fbgvt" (OuterVolumeSpecName: "kube-api-access-fbgvt") pod "3e419c2c-59ae-4c78-bae9-ce583fb091c8" (UID: "3e419c2c-59ae-4c78-bae9-ce583fb091c8"). InnerVolumeSpecName "kube-api-access-fbgvt". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:37:37.235853 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.235830 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e419c2c-59ae-4c78-bae9-ce583fb091c8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3e419c2c-59ae-4c78-bae9-ce583fb091c8" (UID: "3e419c2c-59ae-4c78-bae9-ce583fb091c8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:37:37.334182 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.334157 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fbgvt\" (UniqueName: \"kubernetes.io/projected/3e419c2c-59ae-4c78-bae9-ce583fb091c8-kube-api-access-fbgvt\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:37:37.334182 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.334179 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3e419c2c-59ae-4c78-bae9-ce583fb091c8-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:37:37.334324 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.334190 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3e419c2c-59ae-4c78-bae9-ce583fb091c8-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:37:37.334324 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.334199 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3e419c2c-59ae-4c78-bae9-ce583fb091c8-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 20:37:37.396672 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.396640 2576 generic.go:358] "Generic (PLEG): container finished" podID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerID="9ba61c3eb58614137746af6d78a0e1f8f1514818435b912010fb60c6bf5c9420" exitCode=0
Apr 20 20:37:37.396775 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.396680 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" event={"ID":"3e419c2c-59ae-4c78-bae9-ce583fb091c8","Type":"ContainerDied","Data":"9ba61c3eb58614137746af6d78a0e1f8f1514818435b912010fb60c6bf5c9420"}
Apr 20 20:37:37.396775 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.396703 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz" event={"ID":"3e419c2c-59ae-4c78-bae9-ce583fb091c8","Type":"ContainerDied","Data":"55d8c5b374005319cb27d881cf92d5b5b6119bb0c30981e3443cf97c17fe9a74"}
Apr 20 20:37:37.396775 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.396720 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz"
Apr 20 20:37:37.396886 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.396723 2576 scope.go:117] "RemoveContainer" containerID="0233b0c9b259b3a89399f331e5da4b727db50cb5b4d0dd3cfab3432b4b5e259b"
Apr 20 20:37:37.405164 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.405143 2576 scope.go:117] "RemoveContainer" containerID="9ba61c3eb58614137746af6d78a0e1f8f1514818435b912010fb60c6bf5c9420"
Apr 20 20:37:37.412085 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.412068 2576 scope.go:117] "RemoveContainer" containerID="1b7130f145b6845cdce383f031c16771a8c3cf747f29398803b147443fc3fc4b"
Apr 20 20:37:37.419379 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.419357 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz"]
Apr 20 20:37:37.419574 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.419560 2576 scope.go:117] "RemoveContainer" containerID="0233b0c9b259b3a89399f331e5da4b727db50cb5b4d0dd3cfab3432b4b5e259b"
Apr 20 20:37:37.419789 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:37:37.419771 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0233b0c9b259b3a89399f331e5da4b727db50cb5b4d0dd3cfab3432b4b5e259b\": container with ID starting with 0233b0c9b259b3a89399f331e5da4b727db50cb5b4d0dd3cfab3432b4b5e259b not found: ID does not exist" containerID="0233b0c9b259b3a89399f331e5da4b727db50cb5b4d0dd3cfab3432b4b5e259b"
Apr 20 20:37:37.419862 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.419800 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0233b0c9b259b3a89399f331e5da4b727db50cb5b4d0dd3cfab3432b4b5e259b"} err="failed to get container status \"0233b0c9b259b3a89399f331e5da4b727db50cb5b4d0dd3cfab3432b4b5e259b\": rpc error: code = NotFound desc = could not find container \"0233b0c9b259b3a89399f331e5da4b727db50cb5b4d0dd3cfab3432b4b5e259b\": container with ID starting with 0233b0c9b259b3a89399f331e5da4b727db50cb5b4d0dd3cfab3432b4b5e259b not found: ID does not exist"
Apr 20 20:37:37.419862 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.419826 2576 scope.go:117] "RemoveContainer" containerID="9ba61c3eb58614137746af6d78a0e1f8f1514818435b912010fb60c6bf5c9420"
Apr 20 20:37:37.420083 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:37:37.420063 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ba61c3eb58614137746af6d78a0e1f8f1514818435b912010fb60c6bf5c9420\": container with ID starting with 9ba61c3eb58614137746af6d78a0e1f8f1514818435b912010fb60c6bf5c9420 not found: ID does not exist" containerID="9ba61c3eb58614137746af6d78a0e1f8f1514818435b912010fb60c6bf5c9420"
Apr 20 20:37:37.420223 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.420089 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba61c3eb58614137746af6d78a0e1f8f1514818435b912010fb60c6bf5c9420"} err="failed to get container status \"9ba61c3eb58614137746af6d78a0e1f8f1514818435b912010fb60c6bf5c9420\": rpc error: code = NotFound desc = could not find container \"9ba61c3eb58614137746af6d78a0e1f8f1514818435b912010fb60c6bf5c9420\": container with ID starting with 9ba61c3eb58614137746af6d78a0e1f8f1514818435b912010fb60c6bf5c9420 not found: ID does not exist"
Apr 20 20:37:37.420223 ip-10-0-143-23 kubenswrapper[2576]: I0420
20:37:37.420105 2576 scope.go:117] "RemoveContainer" containerID="1b7130f145b6845cdce383f031c16771a8c3cf747f29398803b147443fc3fc4b" Apr 20 20:37:37.420329 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:37:37.420303 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b7130f145b6845cdce383f031c16771a8c3cf747f29398803b147443fc3fc4b\": container with ID starting with 1b7130f145b6845cdce383f031c16771a8c3cf747f29398803b147443fc3fc4b not found: ID does not exist" containerID="1b7130f145b6845cdce383f031c16771a8c3cf747f29398803b147443fc3fc4b" Apr 20 20:37:37.420369 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.420323 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b7130f145b6845cdce383f031c16771a8c3cf747f29398803b147443fc3fc4b"} err="failed to get container status \"1b7130f145b6845cdce383f031c16771a8c3cf747f29398803b147443fc3fc4b\": rpc error: code = NotFound desc = could not find container \"1b7130f145b6845cdce383f031c16771a8c3cf747f29398803b147443fc3fc4b\": container with ID starting with 1b7130f145b6845cdce383f031c16771a8c3cf747f29398803b147443fc3fc4b not found: ID does not exist" Apr 20 20:37:37.425125 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.425092 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-q8klz"] Apr 20 20:37:37.944410 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:37.944379 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" path="/var/lib/kubelet/pods/3e419c2c-59ae-4c78-bae9-ce583fb091c8/volumes" Apr 20 20:37:38.401053 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:38.401022 2576 generic.go:358] "Generic (PLEG): container finished" podID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerID="d73b2c8d1360a8fa7da48dfcda3a83fa3f7de7e67565b915136a7e2382c5e674" exitCode=0 Apr 20 20:37:38.401457 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:38.401097 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" event={"ID":"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb","Type":"ContainerDied","Data":"d73b2c8d1360a8fa7da48dfcda3a83fa3f7de7e67565b915136a7e2382c5e674"} Apr 20 20:37:39.409788 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:39.409753 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" event={"ID":"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb","Type":"ContainerStarted","Data":"6c801a4d6fa06a53ddbe04643cc5920eb230572f759783ba157d66832c196b68"} Apr 20 20:37:39.409788 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:39.409794 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" event={"ID":"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb","Type":"ContainerStarted","Data":"e48c103a8f9a07f49982875457ccbd485c3b7eddf45fbb7f25633b3b401ebe4a"} Apr 20 20:37:39.410224 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:39.409992 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" Apr 20 20:37:39.430590 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:39.430545 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" podStartSLOduration=5.430529906 
podStartE2EDuration="5.430529906s" podCreationTimestamp="2026-04-20 20:37:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:37:39.427816604 +0000 UTC m=+1934.140498602" watchObservedRunningTime="2026-04-20 20:37:39.430529906 +0000 UTC m=+1934.143211901" Apr 20 20:37:40.413059 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:40.413031 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" Apr 20 20:37:40.414311 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:40.414276 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 20 20:37:41.416955 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:41.416911 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 20 20:37:46.421484 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:46.421455 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" Apr 20 20:37:46.422037 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:46.422010 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 20 20:37:56.422085 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:37:56.422047 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 20 20:38:06.422163 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:06.422078 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 20 20:38:16.422560 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:16.422520 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 20 20:38:26.422358 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:26.422322 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 20 20:38:36.422400 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:36.422311 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 20 20:38:46.423164 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:46.423134 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" Apr 20 20:38:54.174694 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.174657 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8"] Apr 20 20:38:54.175039 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.174992 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kube-rbac-proxy" Apr 20 20:38:54.175039 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.175002 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kube-rbac-proxy" Apr 20 20:38:54.175039 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.175017 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kserve-container" Apr 20 20:38:54.175039 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.175023 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kserve-container" Apr 20 20:38:54.175039 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.175033 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="storage-initializer" Apr 20 20:38:54.175039 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.175039 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="storage-initializer" Apr 20 20:38:54.175253 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.175091 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kserve-container" Apr 20 20:38:54.175253 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.175100 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3e419c2c-59ae-4c78-bae9-ce583fb091c8" containerName="kube-rbac-proxy" Apr 20 20:38:54.178229 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.178213 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.180953 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.180930 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-63f047\"" Apr 20 20:38:54.180953 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.180940 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-63f047-dockercfg-5bmd5\"" Apr 20 20:38:54.181250 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.180930 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-63f047-predictor-serving-cert\"" Apr 20 20:38:54.181250 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.181032 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-63f047-kube-rbac-proxy-sar-config\"" Apr 20 20:38:54.181250 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.181237 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 20 20:38:54.188949 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.188924 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8"] Apr 20 20:38:54.290376 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.290348 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-proxy-tls\") pod \"isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.290503 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.290424 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-kserve-provision-location\") pod \"isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.290503 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.290470 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-cabundle-cert\") pod \"isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.290592 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.290535 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7cmk\" (UniqueName: \"kubernetes.io/projected/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-kube-api-access-l7cmk\") pod \"isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.290592 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.290578 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-secondary-63f047-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-isvc-secondary-63f047-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.391541 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.391514 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-kserve-provision-location\") pod \"isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.391698 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.391564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-cabundle-cert\") pod \"isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.391698 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.391601 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7cmk\" (UniqueName: \"kubernetes.io/projected/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-kube-api-access-l7cmk\") pod \"isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.391698 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.391639 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-63f047-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-isvc-secondary-63f047-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.391885 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.391770 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-proxy-tls\") pod \"isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.391946 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.391926 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-kserve-provision-location\") pod \"isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.392301 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.392278 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-63f047-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-isvc-secondary-63f047-kube-rbac-proxy-sar-config\") pod 
\"isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.392387 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.392324 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-cabundle-cert\") pod \"isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.394296 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.394278 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-proxy-tls\") pod \"isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.400799 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.400780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7cmk\" (UniqueName: \"kubernetes.io/projected/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-kube-api-access-l7cmk\") pod \"isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.489519 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.489470 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:38:54.610784 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.610762 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8"] Apr 20 20:38:54.612864 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:38:54.612833 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b02cc34_80f4_4c6d_bc12_8103bdbb047a.slice/crio-15f7eb8aafbcc090b4de040b086c4dabc270ab480737920bba27a11ab419d198 WatchSource:0}: Error finding container 15f7eb8aafbcc090b4de040b086c4dabc270ab480737920bba27a11ab419d198: Status 404 returned error can't find the container with id 15f7eb8aafbcc090b4de040b086c4dabc270ab480737920bba27a11ab419d198 Apr 20 20:38:54.662543 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:54.662518 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" event={"ID":"5b02cc34-80f4-4c6d-bc12-8103bdbb047a","Type":"ContainerStarted","Data":"15f7eb8aafbcc090b4de040b086c4dabc270ab480737920bba27a11ab419d198"} Apr 20 20:38:55.667438 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:38:55.667404 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" event={"ID":"5b02cc34-80f4-4c6d-bc12-8103bdbb047a","Type":"ContainerStarted","Data":"68aadc64837c5fb01cebda841cb9890d060bf3686b0ab5aa48463019571c7501"} Apr 20 20:39:00.684466 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:00.684435 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8_5b02cc34-80f4-4c6d-bc12-8103bdbb047a/storage-initializer/0.log" Apr 20 20:39:00.684901 
ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:00.684478 2576 generic.go:358] "Generic (PLEG): container finished" podID="5b02cc34-80f4-4c6d-bc12-8103bdbb047a" containerID="68aadc64837c5fb01cebda841cb9890d060bf3686b0ab5aa48463019571c7501" exitCode=1 Apr 20 20:39:00.684901 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:00.684555 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" event={"ID":"5b02cc34-80f4-4c6d-bc12-8103bdbb047a","Type":"ContainerDied","Data":"68aadc64837c5fb01cebda841cb9890d060bf3686b0ab5aa48463019571c7501"} Apr 20 20:39:01.690350 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:01.690325 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8_5b02cc34-80f4-4c6d-bc12-8103bdbb047a/storage-initializer/0.log" Apr 20 20:39:01.690690 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:01.690443 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" event={"ID":"5b02cc34-80f4-4c6d-bc12-8103bdbb047a","Type":"ContainerStarted","Data":"ca8735f2880c93c8e0bb4df26ea7e22bb5ef2245e1f947223533be4abc2b1cf7"} Apr 20 20:39:04.703103 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:04.703071 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8_5b02cc34-80f4-4c6d-bc12-8103bdbb047a/storage-initializer/1.log" Apr 20 20:39:04.703553 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:04.703481 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8_5b02cc34-80f4-4c6d-bc12-8103bdbb047a/storage-initializer/0.log" Apr 20 20:39:04.703553 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:04.703515 2576 generic.go:358] "Generic (PLEG): container finished" podID="5b02cc34-80f4-4c6d-bc12-8103bdbb047a" containerID="ca8735f2880c93c8e0bb4df26ea7e22bb5ef2245e1f947223533be4abc2b1cf7" exitCode=1 Apr 20 20:39:04.703629 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:04.703594 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" event={"ID":"5b02cc34-80f4-4c6d-bc12-8103bdbb047a","Type":"ContainerDied","Data":"ca8735f2880c93c8e0bb4df26ea7e22bb5ef2245e1f947223533be4abc2b1cf7"} Apr 20 20:39:04.703667 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:04.703636 2576 scope.go:117] "RemoveContainer" containerID="68aadc64837c5fb01cebda841cb9890d060bf3686b0ab5aa48463019571c7501" Apr 20 20:39:04.703988 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:04.703968 2576 scope.go:117] "RemoveContainer" containerID="68aadc64837c5fb01cebda841cb9890d060bf3686b0ab5aa48463019571c7501" Apr 20 20:39:04.718148 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:39:04.718087 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8_kserve-ci-e2e-test_5b02cc34-80f4-4c6d-bc12-8103bdbb047a_0 in pod sandbox 15f7eb8aafbcc090b4de040b086c4dabc270ab480737920bba27a11ab419d198 from index: no such id: '68aadc64837c5fb01cebda841cb9890d060bf3686b0ab5aa48463019571c7501'" containerID="68aadc64837c5fb01cebda841cb9890d060bf3686b0ab5aa48463019571c7501" Apr 20 20:39:04.718229 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:04.718142 2576 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"68aadc64837c5fb01cebda841cb9890d060bf3686b0ab5aa48463019571c7501"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8_kserve-ci-e2e-test_5b02cc34-80f4-4c6d-bc12-8103bdbb047a_0 in pod sandbox 15f7eb8aafbcc090b4de040b086c4dabc270ab480737920bba27a11ab419d198 from index: no such id: '68aadc64837c5fb01cebda841cb9890d060bf3686b0ab5aa48463019571c7501'" Apr 20 20:39:04.718323 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:39:04.718303 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8_kserve-ci-e2e-test(5b02cc34-80f4-4c6d-bc12-8103bdbb047a)\"" pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" podUID="5b02cc34-80f4-4c6d-bc12-8103bdbb047a" Apr 20 20:39:05.709211 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:05.709183 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8_5b02cc34-80f4-4c6d-bc12-8103bdbb047a/storage-initializer/1.log" Apr 20 20:39:12.210438 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.210407 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8"] Apr 20 20:39:12.267853 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.267823 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"] Apr 20 20:39:12.268310 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.268259 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kserve-container" containerID="cri-o://e48c103a8f9a07f49982875457ccbd485c3b7eddf45fbb7f25633b3b401ebe4a" gracePeriod=30 Apr 20 20:39:12.268418 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.268295 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kube-rbac-proxy" containerID="cri-o://6c801a4d6fa06a53ddbe04643cc5920eb230572f759783ba157d66832c196b68" gracePeriod=30 Apr 20 20:39:12.363282 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.363252 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt"] Apr 20 20:39:12.368802 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.368780 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.371320 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.371277 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-258425\"" Apr 20 20:39:12.371448 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.371284 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-258425-dockercfg-ht6kv\"" Apr 20 20:39:12.371448 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.371347 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-258425-predictor-serving-cert\"" Apr 20 20:39:12.371448 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.371353 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-258425-kube-rbac-proxy-sar-config\"" Apr 20 20:39:12.376005 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.375983 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt"] Apr 20 20:39:12.396002 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.395983 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8_5b02cc34-80f4-4c6d-bc12-8103bdbb047a/storage-initializer/1.log" Apr 20 20:39:12.396089 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.396053 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:39:12.530079 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530013 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-proxy-tls\") pod \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " Apr 20 20:39:12.530079 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530047 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-63f047-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-isvc-secondary-63f047-kube-rbac-proxy-sar-config\") pod \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " Apr 20 20:39:12.530079 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530072 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-cabundle-cert\") pod \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " Apr 20 20:39:12.530334 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530094 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-kserve-provision-location\") pod \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " Apr 20 20:39:12.530334 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530132 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7cmk\" (UniqueName: 
\"kubernetes.io/projected/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-kube-api-access-l7cmk\") pod \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\" (UID: \"5b02cc34-80f4-4c6d-bc12-8103bdbb047a\") " Apr 20 20:39:12.530334 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e6165d0f-a1f8-4046-b907-8c91088d1b70-cabundle-cert\") pod \"isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.530489 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530336 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6165d0f-a1f8-4046-b907-8c91088d1b70-kserve-provision-location\") pod \"isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.530489 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530362 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6165d0f-a1f8-4046-b907-8c91088d1b70-proxy-tls\") pod \"isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.530489 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530392 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4qjm\" (UniqueName: \"kubernetes.io/projected/e6165d0f-a1f8-4046-b907-8c91088d1b70-kube-api-access-r4qjm\") pod \"isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.530489 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530396 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5b02cc34-80f4-4c6d-bc12-8103bdbb047a" (UID: "5b02cc34-80f4-4c6d-bc12-8103bdbb047a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:39:12.530489 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-258425-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6165d0f-a1f8-4046-b907-8c91088d1b70-isvc-init-fail-258425-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.530736 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530494 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-isvc-secondary-63f047-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-63f047-kube-rbac-proxy-sar-config") pod "5b02cc34-80f4-4c6d-bc12-8103bdbb047a" (UID: "5b02cc34-80f4-4c6d-bc12-8103bdbb047a"). InnerVolumeSpecName "isvc-secondary-63f047-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:39:12.530736 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530499 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "5b02cc34-80f4-4c6d-bc12-8103bdbb047a" (UID: "5b02cc34-80f4-4c6d-bc12-8103bdbb047a"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:39:12.530736 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530599 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-63f047-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-isvc-secondary-63f047-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:39:12.530736 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530626 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-cabundle-cert\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:39:12.530736 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.530646 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:39:12.532295 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.532275 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-kube-api-access-l7cmk" (OuterVolumeSpecName: "kube-api-access-l7cmk") pod "5b02cc34-80f4-4c6d-bc12-8103bdbb047a" (UID: "5b02cc34-80f4-4c6d-bc12-8103bdbb047a"). InnerVolumeSpecName "kube-api-access-l7cmk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:39:12.532434 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.532418 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5b02cc34-80f4-4c6d-bc12-8103bdbb047a" (UID: "5b02cc34-80f4-4c6d-bc12-8103bdbb047a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:39:12.631584 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.631552 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6165d0f-a1f8-4046-b907-8c91088d1b70-kserve-provision-location\") pod \"isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.631703 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.631588 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6165d0f-a1f8-4046-b907-8c91088d1b70-proxy-tls\") pod \"isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.631703 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.631606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4qjm\" (UniqueName: \"kubernetes.io/projected/e6165d0f-a1f8-4046-b907-8c91088d1b70-kube-api-access-r4qjm\") pod \"isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.631703 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.631632 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-258425-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6165d0f-a1f8-4046-b907-8c91088d1b70-isvc-init-fail-258425-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.631856 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.631720 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e6165d0f-a1f8-4046-b907-8c91088d1b70-cabundle-cert\") pod \"isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.631856 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.631775 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:39:12.631856 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.631794 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l7cmk\" (UniqueName: \"kubernetes.io/projected/5b02cc34-80f4-4c6d-bc12-8103bdbb047a-kube-api-access-l7cmk\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:39:12.632001 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.631984 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6165d0f-a1f8-4046-b907-8c91088d1b70-kserve-provision-location\") pod \"isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.632376 ip-10-0-143-23 
kubenswrapper[2576]: I0420 20:39:12.632347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-258425-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6165d0f-a1f8-4046-b907-8c91088d1b70-isvc-init-fail-258425-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.632463 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.632398 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e6165d0f-a1f8-4046-b907-8c91088d1b70-cabundle-cert\") pod \"isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.633963 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.633943 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6165d0f-a1f8-4046-b907-8c91088d1b70-proxy-tls\") pod \"isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.639297 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.639277 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4qjm\" (UniqueName: \"kubernetes.io/projected/e6165d0f-a1f8-4046-b907-8c91088d1b70-kube-api-access-r4qjm\") pod \"isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.694282 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.694260 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:12.734630 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.734606 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8_5b02cc34-80f4-4c6d-bc12-8103bdbb047a/storage-initializer/1.log" Apr 20 20:39:12.734772 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.734727 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" Apr 20 20:39:12.734840 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.734726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8" event={"ID":"5b02cc34-80f4-4c6d-bc12-8103bdbb047a","Type":"ContainerDied","Data":"15f7eb8aafbcc090b4de040b086c4dabc270ab480737920bba27a11ab419d198"} Apr 20 20:39:12.734887 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.734846 2576 scope.go:117] "RemoveContainer" containerID="ca8735f2880c93c8e0bb4df26ea7e22bb5ef2245e1f947223533be4abc2b1cf7" Apr 20 20:39:12.736868 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.736832 2576 generic.go:358] "Generic (PLEG): container finished" podID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerID="6c801a4d6fa06a53ddbe04643cc5920eb230572f759783ba157d66832c196b68" exitCode=2 Apr 20 20:39:12.736967 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.736889 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" event={"ID":"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb","Type":"ContainerDied","Data":"6c801a4d6fa06a53ddbe04643cc5920eb230572f759783ba157d66832c196b68"} Apr 20 20:39:12.776488 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.776252 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8"] Apr 20 20:39:12.781638 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.781582 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-63f047-predictor-6459c4bbc9-4sbw8"] Apr 20 20:39:12.826026 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:12.826005 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt"] Apr 20 20:39:12.831487 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:39:12.831462 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6165d0f_a1f8_4046_b907_8c91088d1b70.slice/crio-bec97b5a8e57e286743df5f85160cdffd5aba43984d675ca5aee44b0d1dcb10d WatchSource:0}: Error finding container bec97b5a8e57e286743df5f85160cdffd5aba43984d675ca5aee44b0d1dcb10d: Status 404 returned error can't find the container with id bec97b5a8e57e286743df5f85160cdffd5aba43984d675ca5aee44b0d1dcb10d Apr 20 20:39:13.741971 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:13.741932 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" event={"ID":"e6165d0f-a1f8-4046-b907-8c91088d1b70","Type":"ContainerStarted","Data":"1e9202ccee57d303ef76154c4c820212a0b5de2b3c7033f7dc45544fb83fd2f6"} Apr 20 20:39:13.741971 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:13.741975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" event={"ID":"e6165d0f-a1f8-4046-b907-8c91088d1b70","Type":"ContainerStarted","Data":"bec97b5a8e57e286743df5f85160cdffd5aba43984d675ca5aee44b0d1dcb10d"} Apr 20 20:39:13.943718 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:13.943686 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b02cc34-80f4-4c6d-bc12-8103bdbb047a" path="/var/lib/kubelet/pods/5b02cc34-80f4-4c6d-bc12-8103bdbb047a/volumes" Apr 20 20:39:15.750264 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:15.750187 2576 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt_e6165d0f-a1f8-4046-b907-8c91088d1b70/storage-initializer/0.log" Apr 20 20:39:15.750264 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:15.750225 2576 generic.go:358] "Generic (PLEG): container finished" podID="e6165d0f-a1f8-4046-b907-8c91088d1b70" containerID="1e9202ccee57d303ef76154c4c820212a0b5de2b3c7033f7dc45544fb83fd2f6" exitCode=1 Apr 20 20:39:15.750645 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:15.750307 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" event={"ID":"e6165d0f-a1f8-4046-b907-8c91088d1b70","Type":"ContainerDied","Data":"1e9202ccee57d303ef76154c4c820212a0b5de2b3c7033f7dc45544fb83fd2f6"} Apr 20 20:39:16.209573 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.209552 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" Apr 20 20:39:16.358555 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.358477 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-kserve-provision-location\") pod \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " Apr 20 20:39:16.358555 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.358523 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-proxy-tls\") pod \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " Apr 20 20:39:16.358765 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.358594 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fs2vh\" (UniqueName: \"kubernetes.io/projected/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-kube-api-access-fs2vh\") pod \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " Apr 20 20:39:16.358765 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.358615 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-63f047-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-isvc-primary-63f047-kube-rbac-proxy-sar-config\") pod \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\" (UID: \"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb\") " Apr 20 20:39:16.358880 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.358778 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" (UID: "2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:39:16.359047 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.359022 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-isvc-primary-63f047-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-63f047-kube-rbac-proxy-sar-config") pod "2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" (UID: "2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb"). InnerVolumeSpecName "isvc-primary-63f047-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:39:16.360660 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.360642 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-kube-api-access-fs2vh" (OuterVolumeSpecName: "kube-api-access-fs2vh") pod "2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" (UID: "2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb"). InnerVolumeSpecName "kube-api-access-fs2vh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:39:16.360721 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.360658 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" (UID: "2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:39:16.459971 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.459932 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:39:16.459971 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.459972 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:39:16.460174 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.459990 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fs2vh\" (UniqueName: \"kubernetes.io/projected/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-kube-api-access-fs2vh\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:39:16.460174 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.460004 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-63f047-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb-isvc-primary-63f047-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:39:16.755398 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.755308 2576 generic.go:358] "Generic (PLEG): container finished" podID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerID="e48c103a8f9a07f49982875457ccbd485c3b7eddf45fbb7f25633b3b401ebe4a" exitCode=0 Apr 20 20:39:16.755801 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.755398 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" Apr 20 20:39:16.755801 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.755390 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" event={"ID":"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb","Type":"ContainerDied","Data":"e48c103a8f9a07f49982875457ccbd485c3b7eddf45fbb7f25633b3b401ebe4a"} Apr 20 20:39:16.755801 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.755503 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq" event={"ID":"2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb","Type":"ContainerDied","Data":"462b9a5c2bbdced8f7cf10bcad06983439464085a09b16b742258956b84be0b5"} Apr 20 20:39:16.755801 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.755535 2576 scope.go:117] "RemoveContainer" containerID="6c801a4d6fa06a53ddbe04643cc5920eb230572f759783ba157d66832c196b68" Apr 20 20:39:16.757165 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.757145 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt_e6165d0f-a1f8-4046-b907-8c91088d1b70/storage-initializer/0.log" Apr 20 20:39:16.757270 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.757254 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" event={"ID":"e6165d0f-a1f8-4046-b907-8c91088d1b70","Type":"ContainerStarted","Data":"1c8500315a39010d1d6787a455c14db3ad8d62b6365fb5e92a72693c190122ee"} Apr 20 20:39:16.764703 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.764686 2576 scope.go:117] "RemoveContainer" containerID="e48c103a8f9a07f49982875457ccbd485c3b7eddf45fbb7f25633b3b401ebe4a" Apr 20 20:39:16.771853 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.771838 2576 scope.go:117] "RemoveContainer" containerID="d73b2c8d1360a8fa7da48dfcda3a83fa3f7de7e67565b915136a7e2382c5e674" Apr 20 20:39:16.778155 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.778140 2576 scope.go:117] "RemoveContainer" containerID="6c801a4d6fa06a53ddbe04643cc5920eb230572f759783ba157d66832c196b68" Apr 20 20:39:16.778400 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:39:16.778380 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c801a4d6fa06a53ddbe04643cc5920eb230572f759783ba157d66832c196b68\": container with ID starting with 6c801a4d6fa06a53ddbe04643cc5920eb230572f759783ba157d66832c196b68 not found: ID does not exist" containerID="6c801a4d6fa06a53ddbe04643cc5920eb230572f759783ba157d66832c196b68" Apr 20 20:39:16.778454 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.778409 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c801a4d6fa06a53ddbe04643cc5920eb230572f759783ba157d66832c196b68"} err="failed to get container status \"6c801a4d6fa06a53ddbe04643cc5920eb230572f759783ba157d66832c196b68\": rpc error: code = NotFound desc = could not find container \"6c801a4d6fa06a53ddbe04643cc5920eb230572f759783ba157d66832c196b68\": container with ID starting with 6c801a4d6fa06a53ddbe04643cc5920eb230572f759783ba157d66832c196b68 not found: ID does not exist" Apr 20 20:39:16.778454 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.778426 2576 scope.go:117] "RemoveContainer" containerID="e48c103a8f9a07f49982875457ccbd485c3b7eddf45fbb7f25633b3b401ebe4a" Apr 20 20:39:16.778656 
ip-10-0-143-23 kubenswrapper[2576]: E0420 20:39:16.778637 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e48c103a8f9a07f49982875457ccbd485c3b7eddf45fbb7f25633b3b401ebe4a\": container with ID starting with e48c103a8f9a07f49982875457ccbd485c3b7eddf45fbb7f25633b3b401ebe4a not found: ID does not exist" containerID="e48c103a8f9a07f49982875457ccbd485c3b7eddf45fbb7f25633b3b401ebe4a" Apr 20 20:39:16.778701 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.778663 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e48c103a8f9a07f49982875457ccbd485c3b7eddf45fbb7f25633b3b401ebe4a"} err="failed to get container status \"e48c103a8f9a07f49982875457ccbd485c3b7eddf45fbb7f25633b3b401ebe4a\": rpc error: code = NotFound desc = could not find container \"e48c103a8f9a07f49982875457ccbd485c3b7eddf45fbb7f25633b3b401ebe4a\": container with ID starting with e48c103a8f9a07f49982875457ccbd485c3b7eddf45fbb7f25633b3b401ebe4a not found: ID does not exist" Apr 20 20:39:16.778701 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.778678 2576 scope.go:117] "RemoveContainer" containerID="d73b2c8d1360a8fa7da48dfcda3a83fa3f7de7e67565b915136a7e2382c5e674" Apr 20 20:39:16.778882 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:39:16.778865 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d73b2c8d1360a8fa7da48dfcda3a83fa3f7de7e67565b915136a7e2382c5e674\": container with ID starting with d73b2c8d1360a8fa7da48dfcda3a83fa3f7de7e67565b915136a7e2382c5e674 not found: ID does not exist" containerID="d73b2c8d1360a8fa7da48dfcda3a83fa3f7de7e67565b915136a7e2382c5e674" Apr 20 20:39:16.778926 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.778885 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d73b2c8d1360a8fa7da48dfcda3a83fa3f7de7e67565b915136a7e2382c5e674"} err="failed to get container status \"d73b2c8d1360a8fa7da48dfcda3a83fa3f7de7e67565b915136a7e2382c5e674\": rpc error: code = NotFound desc = could not find container \"d73b2c8d1360a8fa7da48dfcda3a83fa3f7de7e67565b915136a7e2382c5e674\": container with ID starting with d73b2c8d1360a8fa7da48dfcda3a83fa3f7de7e67565b915136a7e2382c5e674 not found: ID does not exist" Apr 20 20:39:16.792639 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.792619 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"] Apr 20 20:39:16.794997 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:16.794975 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-63f047-predictor-5b4c68b4d-xwbzq"] Apr 20 20:39:17.354914 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:17.354879 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt"] Apr 20 20:39:17.762812 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:17.762718 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" podUID="e6165d0f-a1f8-4046-b907-8c91088d1b70" containerName="storage-initializer" containerID="cri-o://1c8500315a39010d1d6787a455c14db3ad8d62b6365fb5e92a72693c190122ee" gracePeriod=30 Apr 20 20:39:17.944000 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:17.943967 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" path="/var/lib/kubelet/pods/2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb/volumes" Apr 20 20:39:19.006540 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.006521 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt_e6165d0f-a1f8-4046-b907-8c91088d1b70/storage-initializer/1.log" Apr 20 20:39:19.006878 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.006863 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt_e6165d0f-a1f8-4046-b907-8c91088d1b70/storage-initializer/0.log" Apr 20 20:39:19.006939 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.006929 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:19.078561 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.078536 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e6165d0f-a1f8-4046-b907-8c91088d1b70-cabundle-cert\") pod \"e6165d0f-a1f8-4046-b907-8c91088d1b70\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " Apr 20 20:39:19.078680 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.078590 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6165d0f-a1f8-4046-b907-8c91088d1b70-proxy-tls\") pod \"e6165d0f-a1f8-4046-b907-8c91088d1b70\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " Apr 20 20:39:19.078774 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.078754 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-258425-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6165d0f-a1f8-4046-b907-8c91088d1b70-isvc-init-fail-258425-kube-rbac-proxy-sar-config\") pod \"e6165d0f-a1f8-4046-b907-8c91088d1b70\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " Apr 20 20:39:19.078811 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.078797 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4qjm\" (UniqueName: \"kubernetes.io/projected/e6165d0f-a1f8-4046-b907-8c91088d1b70-kube-api-access-r4qjm\") pod \"e6165d0f-a1f8-4046-b907-8c91088d1b70\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " Apr 20 20:39:19.078858 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.078823 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6165d0f-a1f8-4046-b907-8c91088d1b70-kserve-provision-location\") pod \"e6165d0f-a1f8-4046-b907-8c91088d1b70\" (UID: \"e6165d0f-a1f8-4046-b907-8c91088d1b70\") " Apr 20 20:39:19.078942 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.078919 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6165d0f-a1f8-4046-b907-8c91088d1b70-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "e6165d0f-a1f8-4046-b907-8c91088d1b70" (UID: "e6165d0f-a1f8-4046-b907-8c91088d1b70"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:39:19.079139 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.079095 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6165d0f-a1f8-4046-b907-8c91088d1b70-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e6165d0f-a1f8-4046-b907-8c91088d1b70" (UID: "e6165d0f-a1f8-4046-b907-8c91088d1b70"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:39:19.079139 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.079103 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6165d0f-a1f8-4046-b907-8c91088d1b70-isvc-init-fail-258425-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-258425-kube-rbac-proxy-sar-config") pod "e6165d0f-a1f8-4046-b907-8c91088d1b70" (UID: "e6165d0f-a1f8-4046-b907-8c91088d1b70"). InnerVolumeSpecName "isvc-init-fail-258425-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:39:19.079297 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.079180 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e6165d0f-a1f8-4046-b907-8c91088d1b70-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:39:19.079297 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.079195 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e6165d0f-a1f8-4046-b907-8c91088d1b70-cabundle-cert\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:39:19.080687 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.080669 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6165d0f-a1f8-4046-b907-8c91088d1b70-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e6165d0f-a1f8-4046-b907-8c91088d1b70" (UID: "e6165d0f-a1f8-4046-b907-8c91088d1b70"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:39:19.080760 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.080734 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6165d0f-a1f8-4046-b907-8c91088d1b70-kube-api-access-r4qjm" (OuterVolumeSpecName: "kube-api-access-r4qjm") pod "e6165d0f-a1f8-4046-b907-8c91088d1b70" (UID: "e6165d0f-a1f8-4046-b907-8c91088d1b70"). InnerVolumeSpecName "kube-api-access-r4qjm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:39:19.179917 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.179864 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-258425-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e6165d0f-a1f8-4046-b907-8c91088d1b70-isvc-init-fail-258425-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:39:19.179917 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.179886 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r4qjm\" (UniqueName: \"kubernetes.io/projected/e6165d0f-a1f8-4046-b907-8c91088d1b70-kube-api-access-r4qjm\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:39:19.179917 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.179896 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e6165d0f-a1f8-4046-b907-8c91088d1b70-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:39:19.770984 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.770955 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt_e6165d0f-a1f8-4046-b907-8c91088d1b70/storage-initializer/1.log" Apr 20 20:39:19.771323 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.771304 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt_e6165d0f-a1f8-4046-b907-8c91088d1b70/storage-initializer/0.log" Apr 20 20:39:19.771383 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.771342 2576 generic.go:358] "Generic (PLEG): container finished" podID="e6165d0f-a1f8-4046-b907-8c91088d1b70" containerID="1c8500315a39010d1d6787a455c14db3ad8d62b6365fb5e92a72693c190122ee" exitCode=1 Apr 20 20:39:19.771420 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.771395 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" event={"ID":"e6165d0f-a1f8-4046-b907-8c91088d1b70","Type":"ContainerDied","Data":"1c8500315a39010d1d6787a455c14db3ad8d62b6365fb5e92a72693c190122ee"} Apr 20 20:39:19.771462 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.771419 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" Apr 20 20:39:19.771462 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.771431 2576 scope.go:117] "RemoveContainer" containerID="1c8500315a39010d1d6787a455c14db3ad8d62b6365fb5e92a72693c190122ee" Apr 20 20:39:19.771562 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.771421 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt" event={"ID":"e6165d0f-a1f8-4046-b907-8c91088d1b70","Type":"ContainerDied","Data":"bec97b5a8e57e286743df5f85160cdffd5aba43984d675ca5aee44b0d1dcb10d"} Apr 20 20:39:19.779803 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.779784 2576 scope.go:117] "RemoveContainer" containerID="1e9202ccee57d303ef76154c4c820212a0b5de2b3c7033f7dc45544fb83fd2f6" Apr 20 20:39:19.789212 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.789177 2576 scope.go:117] "RemoveContainer" containerID="1c8500315a39010d1d6787a455c14db3ad8d62b6365fb5e92a72693c190122ee" Apr 20 20:39:19.789668 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:39:19.789645 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c8500315a39010d1d6787a455c14db3ad8d62b6365fb5e92a72693c190122ee\": container with ID starting with 1c8500315a39010d1d6787a455c14db3ad8d62b6365fb5e92a72693c190122ee not found: ID does not exist" containerID="1c8500315a39010d1d6787a455c14db3ad8d62b6365fb5e92a72693c190122ee" Apr 20 20:39:19.789733 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.789676 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c8500315a39010d1d6787a455c14db3ad8d62b6365fb5e92a72693c190122ee"} err="failed to get container status \"1c8500315a39010d1d6787a455c14db3ad8d62b6365fb5e92a72693c190122ee\": rpc error: code = NotFound desc = could not find container \"1c8500315a39010d1d6787a455c14db3ad8d62b6365fb5e92a72693c190122ee\": container with ID starting with 1c8500315a39010d1d6787a455c14db3ad8d62b6365fb5e92a72693c190122ee not found: ID does not exist" Apr 20 20:39:19.789733 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.789693 2576 scope.go:117] "RemoveContainer" containerID="1e9202ccee57d303ef76154c4c820212a0b5de2b3c7033f7dc45544fb83fd2f6" Apr 20 20:39:19.789969 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:39:19.789949 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e9202ccee57d303ef76154c4c820212a0b5de2b3c7033f7dc45544fb83fd2f6\": container with ID starting with 1e9202ccee57d303ef76154c4c820212a0b5de2b3c7033f7dc45544fb83fd2f6 not found: ID does not exist" containerID="1e9202ccee57d303ef76154c4c820212a0b5de2b3c7033f7dc45544fb83fd2f6" Apr 20 20:39:19.790030 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.789974 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e9202ccee57d303ef76154c4c820212a0b5de2b3c7033f7dc45544fb83fd2f6"} err="failed to get container status \"1e9202ccee57d303ef76154c4c820212a0b5de2b3c7033f7dc45544fb83fd2f6\": rpc error: code = NotFound desc = could not find container \"1e9202ccee57d303ef76154c4c820212a0b5de2b3c7033f7dc45544fb83fd2f6\": container with ID starting with 1e9202ccee57d303ef76154c4c820212a0b5de2b3c7033f7dc45544fb83fd2f6 not found: ID does not exist" Apr 20 20:39:19.810701 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.810678 2576 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt"] Apr 20 20:39:19.812916 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.812895 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-258425-predictor-59d7cd9f6c-l8mxt"] Apr 20 20:39:19.944378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:39:19.944351 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6165d0f-a1f8-4046-b907-8c91088d1b70" path="/var/lib/kubelet/pods/e6165d0f-a1f8-4046-b907-8c91088d1b70/volumes" Apr 20 20:40:54.430204 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:40:54.430091 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:40:54.435205 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:40:54.435185 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:45:54.451941 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:45:54.451833 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:45:54.458446 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:45:54.458427 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:48:40.996969 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.996888 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"] Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997243 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="storage-initializer" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997257 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="storage-initializer" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997265 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b02cc34-80f4-4c6d-bc12-8103bdbb047a" containerName="storage-initializer" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997272 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b02cc34-80f4-4c6d-bc12-8103bdbb047a" containerName="storage-initializer" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997281 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kserve-container" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997287 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kserve-container" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997300 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b02cc34-80f4-4c6d-bc12-8103bdbb047a" containerName="storage-initializer" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997306 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b02cc34-80f4-4c6d-bc12-8103bdbb047a" 
containerName="storage-initializer" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997320 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6165d0f-a1f8-4046-b907-8c91088d1b70" containerName="storage-initializer" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997325 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6165d0f-a1f8-4046-b907-8c91088d1b70" containerName="storage-initializer" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997332 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kube-rbac-proxy" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997337 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kube-rbac-proxy" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997342 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e6165d0f-a1f8-4046-b907-8c91088d1b70" containerName="storage-initializer" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997347 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6165d0f-a1f8-4046-b907-8c91088d1b70" containerName="storage-initializer" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997395 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b02cc34-80f4-4c6d-bc12-8103bdbb047a" containerName="storage-initializer" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997404 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kserve-container" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997410 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d5611b1-ee37-4683-91f1-6d2a6fb3fcfb" containerName="kube-rbac-proxy" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997416 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6165d0f-a1f8-4046-b907-8c91088d1b70" containerName="storage-initializer" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997422 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b02cc34-80f4-4c6d-bc12-8103bdbb047a" containerName="storage-initializer" Apr 20 20:48:40.997525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:40.997521 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="e6165d0f-a1f8-4046-b907-8c91088d1b70" containerName="storage-initializer" Apr 20 20:48:41.000489 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.000471 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" Apr 20 20:48:41.002922 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.002897 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-predictor-serving-cert\"" Apr 20 20:48:41.003067 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.002899 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-kube-rbac-proxy-sar-config\"" Apr 20 20:48:41.003067 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.003009 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 20 20:48:41.003359 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.003340 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 20 20:48:41.003997 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.003979 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5zv6\"" Apr 20 20:48:41.009893 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.009872 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"] Apr 20 20:48:41.014418 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.014397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/677a00d8-2c24-4e16-b790-7d05ec523d1a-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gttwj\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" Apr 20 20:48:41.014509 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.014432 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/677a00d8-2c24-4e16-b790-7d05ec523d1a-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gttwj\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" Apr 20 20:48:41.014509 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.014450 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fftbp\" (UniqueName: \"kubernetes.io/projected/677a00d8-2c24-4e16-b790-7d05ec523d1a-kube-api-access-fftbp\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gttwj\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" Apr 20 20:48:41.014509 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.014492 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/677a00d8-2c24-4e16-b790-7d05ec523d1a-kserve-provision-location\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gttwj\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" Apr 20 20:48:41.115387 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.115357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/677a00d8-2c24-4e16-b790-7d05ec523d1a-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gttwj\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" Apr 20 20:48:41.115543 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.115394 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/677a00d8-2c24-4e16-b790-7d05ec523d1a-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gttwj\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" Apr 20 20:48:41.115543 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.115411 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fftbp\" (UniqueName: \"kubernetes.io/projected/677a00d8-2c24-4e16-b790-7d05ec523d1a-kube-api-access-fftbp\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gttwj\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" Apr 20 20:48:41.115543 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.115428 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/677a00d8-2c24-4e16-b790-7d05ec523d1a-kserve-provision-location\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gttwj\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" Apr 20 20:48:41.115543 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:48:41.115519 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-predictor-serving-cert: secret "isvc-sklearn-predictor-serving-cert" not found Apr 20 20:48:41.115757 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:48:41.115582 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/677a00d8-2c24-4e16-b790-7d05ec523d1a-proxy-tls podName:677a00d8-2c24-4e16-b790-7d05ec523d1a nodeName:}" failed. No retries permitted until 2026-04-20 20:48:41.615562992 +0000 UTC m=+2596.328244970 (durationBeforeRetry 500ms). 
Apr 20 20:48:41.115917 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.115896 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/677a00d8-2c24-4e16-b790-7d05ec523d1a-kserve-provision-location\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gttwj\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"
Apr 20 20:48:41.116158 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.116136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/677a00d8-2c24-4e16-b790-7d05ec523d1a-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gttwj\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"
Apr 20 20:48:41.125761 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.125739 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fftbp\" (UniqueName: \"kubernetes.io/projected/677a00d8-2c24-4e16-b790-7d05ec523d1a-kube-api-access-fftbp\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gttwj\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"
Apr 20 20:48:41.618465 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.618429 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/677a00d8-2c24-4e16-b790-7d05ec523d1a-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gttwj\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"
Apr 20 20:48:41.620706 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.620686 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/677a00d8-2c24-4e16-b790-7d05ec523d1a-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-gttwj\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"
Apr 20 20:48:41.911610 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:41.911523 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"
Apr 20 20:48:42.035407 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:42.035375 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"]
Apr 20 20:48:42.037546 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:48:42.037523 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod677a00d8_2c24_4e16_b790_7d05ec523d1a.slice/crio-f542a599912dabd8b0ad14b79f83e2504a5c75dba9ef4487b22b4a429b0309f4 WatchSource:0}: Error finding container f542a599912dabd8b0ad14b79f83e2504a5c75dba9ef4487b22b4a429b0309f4: Status 404 returned error can't find the container with id f542a599912dabd8b0ad14b79f83e2504a5c75dba9ef4487b22b4a429b0309f4
Apr 20 20:48:42.039379 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:42.039359 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:48:42.630127 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:42.630086 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" event={"ID":"677a00d8-2c24-4e16-b790-7d05ec523d1a","Type":"ContainerStarted","Data":"8f11c71138d8f6e3b8601c83a7c26b2e104ffc902ccd0c24dbf9a4b599370972"}
Apr 20 20:48:42.630127 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:42.630133 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" event={"ID":"677a00d8-2c24-4e16-b790-7d05ec523d1a","Type":"ContainerStarted","Data":"f542a599912dabd8b0ad14b79f83e2504a5c75dba9ef4487b22b4a429b0309f4"}
Apr 20 20:48:46.643754 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:46.643717 2576 generic.go:358] "Generic (PLEG): container finished" podID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerID="8f11c71138d8f6e3b8601c83a7c26b2e104ffc902ccd0c24dbf9a4b599370972" exitCode=0
Apr 20 20:48:46.644177 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:46.643789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" event={"ID":"677a00d8-2c24-4e16-b790-7d05ec523d1a","Type":"ContainerDied","Data":"8f11c71138d8f6e3b8601c83a7c26b2e104ffc902ccd0c24dbf9a4b599370972"}
Apr 20 20:48:47.648862 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:47.648827 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" event={"ID":"677a00d8-2c24-4e16-b790-7d05ec523d1a","Type":"ContainerStarted","Data":"92aad0ab8e5a2dc620e0286d6ad30702ce2e8af032c02394d3ec3b30aa810a07"}
Apr 20 20:48:47.649258 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:47.648868 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" event={"ID":"677a00d8-2c24-4e16-b790-7d05ec523d1a","Type":"ContainerStarted","Data":"dee4e17d1482c7e6e7085befd73dbf2ec47c128d35f9ea299566acb3ee1bf7ae"}
Apr 20 20:48:47.649258 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:47.649066 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"
Apr 20 20:48:47.668685 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:47.668642 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" podStartSLOduration=7.668629456 podStartE2EDuration="7.668629456s" podCreationTimestamp="2026-04-20 20:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:48:47.666329001 +0000 UTC m=+2602.379011013" watchObservedRunningTime="2026-04-20 20:48:47.668629456 +0000 UTC m=+2602.381311454"
Apr 20 20:48:48.652318 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:48.652287 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"
Apr 20 20:48:48.653638 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:48.653610 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 20 20:48:49.655580 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:49.655542 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 20 20:48:54.660043 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:54.660013 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"
Apr 20 20:48:54.660662 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:48:54.660633 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 20 20:49:04.660917 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:49:04.660880 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 20 20:49:14.661559 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:49:14.661516 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 20 20:49:24.661370 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:49:24.661332 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 20 20:49:34.660952 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:49:34.660909 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 20 20:49:44.661025 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:49:44.660983 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 20 20:49:54.661170 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:49:54.661141 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"
Apr 20 20:50:01.103728 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.103689 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"]
Apr 20 20:50:01.104192 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.103999 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kserve-container" containerID="cri-o://dee4e17d1482c7e6e7085befd73dbf2ec47c128d35f9ea299566acb3ee1bf7ae" gracePeriod=30
Apr 20 20:50:01.104192 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.104139 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kube-rbac-proxy" containerID="cri-o://92aad0ab8e5a2dc620e0286d6ad30702ce2e8af032c02394d3ec3b30aa810a07" gracePeriod=30
Apr 20 20:50:01.196087 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.196051 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db"]
Apr 20 20:50:01.199598 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.199581 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db"
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:01.201960 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.201934 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-predictor-serving-cert\"" Apr 20 20:50:01.202091 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.202057 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 20 20:50:01.209032 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.209002 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db"] Apr 20 20:50:01.233457 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.233431 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n65pd\" (UniqueName: \"kubernetes.io/projected/43a042b3-d860-41f0-b238-1a95776d12b8-kube-api-access-n65pd\") pod \"sklearn-v2-mlserver-predictor-65d8664766-8g2db\" (UID: \"43a042b3-d860-41f0-b238-1a95776d12b8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:01.233578 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.233469 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43a042b3-d860-41f0-b238-1a95776d12b8-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-8g2db\" (UID: \"43a042b3-d860-41f0-b238-1a95776d12b8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:01.233578 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.233526 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/43a042b3-d860-41f0-b238-1a95776d12b8-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-8g2db\" (UID: \"43a042b3-d860-41f0-b238-1a95776d12b8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:01.233578 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.233559 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43a042b3-d860-41f0-b238-1a95776d12b8-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-8g2db\" (UID: \"43a042b3-d860-41f0-b238-1a95776d12b8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:01.333935 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.333907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n65pd\" (UniqueName: \"kubernetes.io/projected/43a042b3-d860-41f0-b238-1a95776d12b8-kube-api-access-n65pd\") pod \"sklearn-v2-mlserver-predictor-65d8664766-8g2db\" (UID: \"43a042b3-d860-41f0-b238-1a95776d12b8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:01.334152 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.333943 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43a042b3-d860-41f0-b238-1a95776d12b8-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-8g2db\" (UID: 
\"43a042b3-d860-41f0-b238-1a95776d12b8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:01.334152 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.333980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/43a042b3-d860-41f0-b238-1a95776d12b8-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-8g2db\" (UID: \"43a042b3-d860-41f0-b238-1a95776d12b8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:01.334152 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.334017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43a042b3-d860-41f0-b238-1a95776d12b8-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-8g2db\" (UID: \"43a042b3-d860-41f0-b238-1a95776d12b8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:01.334473 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.334454 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43a042b3-d860-41f0-b238-1a95776d12b8-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-8g2db\" (UID: \"43a042b3-d860-41f0-b238-1a95776d12b8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:01.334743 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.334724 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/43a042b3-d860-41f0-b238-1a95776d12b8-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-8g2db\" (UID: \"43a042b3-d860-41f0-b238-1a95776d12b8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:01.336376 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.336351 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43a042b3-d860-41f0-b238-1a95776d12b8-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-8g2db\" (UID: \"43a042b3-d860-41f0-b238-1a95776d12b8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:01.343638 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.343616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n65pd\" (UniqueName: \"kubernetes.io/projected/43a042b3-d860-41f0-b238-1a95776d12b8-kube-api-access-n65pd\") pod \"sklearn-v2-mlserver-predictor-65d8664766-8g2db\" (UID: \"43a042b3-d860-41f0-b238-1a95776d12b8\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:01.512812 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.512732 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:01.632766 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.632737 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db"] Apr 20 20:50:01.634983 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:50:01.634955 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43a042b3_d860_41f0_b238_1a95776d12b8.slice/crio-5e5482c746af094412b5cac57481245b7ce8907ef98e6a699a47196367c640a4 WatchSource:0}: Error finding container 5e5482c746af094412b5cac57481245b7ce8907ef98e6a699a47196367c640a4: Status 404 returned error can't find the container with id 5e5482c746af094412b5cac57481245b7ce8907ef98e6a699a47196367c640a4 Apr 20 20:50:01.904731 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.904689 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" event={"ID":"43a042b3-d860-41f0-b238-1a95776d12b8","Type":"ContainerStarted","Data":"926216eb9f04f63952b24dad684779ac27c8fb8c2c948af2a816b56133398fb7"} Apr 20 20:50:01.904731 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.904738 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" event={"ID":"43a042b3-d860-41f0-b238-1a95776d12b8","Type":"ContainerStarted","Data":"5e5482c746af094412b5cac57481245b7ce8907ef98e6a699a47196367c640a4"} Apr 20 20:50:01.906632 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.906595 2576 generic.go:358] "Generic (PLEG): container finished" podID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerID="92aad0ab8e5a2dc620e0286d6ad30702ce2e8af032c02394d3ec3b30aa810a07" exitCode=2 Apr 20 20:50:01.906745 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:01.906636 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" event={"ID":"677a00d8-2c24-4e16-b790-7d05ec523d1a","Type":"ContainerDied","Data":"92aad0ab8e5a2dc620e0286d6ad30702ce2e8af032c02394d3ec3b30aa810a07"} Apr 20 20:50:04.656206 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:04.656166 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 20 20:50:04.660738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:04.660704 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused" Apr 20 20:50:04.919820 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:04.919759 2576 generic.go:358] "Generic (PLEG): container finished" podID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerID="dee4e17d1482c7e6e7085befd73dbf2ec47c128d35f9ea299566acb3ee1bf7ae" exitCode=0 Apr 20 20:50:04.919820 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:04.919802 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" 
event={"ID":"677a00d8-2c24-4e16-b790-7d05ec523d1a","Type":"ContainerDied","Data":"dee4e17d1482c7e6e7085befd73dbf2ec47c128d35f9ea299566acb3ee1bf7ae"} Apr 20 20:50:05.047327 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.047304 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" Apr 20 20:50:05.063244 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.063222 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/677a00d8-2c24-4e16-b790-7d05ec523d1a-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"677a00d8-2c24-4e16-b790-7d05ec523d1a\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " Apr 20 20:50:05.063378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.063278 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/677a00d8-2c24-4e16-b790-7d05ec523d1a-kserve-provision-location\") pod \"677a00d8-2c24-4e16-b790-7d05ec523d1a\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " Apr 20 20:50:05.063378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.063347 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fftbp\" (UniqueName: \"kubernetes.io/projected/677a00d8-2c24-4e16-b790-7d05ec523d1a-kube-api-access-fftbp\") pod \"677a00d8-2c24-4e16-b790-7d05ec523d1a\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " Apr 20 20:50:05.063378 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.063372 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/677a00d8-2c24-4e16-b790-7d05ec523d1a-proxy-tls\") pod \"677a00d8-2c24-4e16-b790-7d05ec523d1a\" (UID: \"677a00d8-2c24-4e16-b790-7d05ec523d1a\") " Apr 20 20:50:05.063637 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.063610 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/677a00d8-2c24-4e16-b790-7d05ec523d1a-isvc-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-kube-rbac-proxy-sar-config") pod "677a00d8-2c24-4e16-b790-7d05ec523d1a" (UID: "677a00d8-2c24-4e16-b790-7d05ec523d1a"). InnerVolumeSpecName "isvc-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:50:05.063691 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.063626 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677a00d8-2c24-4e16-b790-7d05ec523d1a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "677a00d8-2c24-4e16-b790-7d05ec523d1a" (UID: "677a00d8-2c24-4e16-b790-7d05ec523d1a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:50:05.065451 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.065418 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/677a00d8-2c24-4e16-b790-7d05ec523d1a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "677a00d8-2c24-4e16-b790-7d05ec523d1a" (UID: "677a00d8-2c24-4e16-b790-7d05ec523d1a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:50:05.065721 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.065699 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/677a00d8-2c24-4e16-b790-7d05ec523d1a-kube-api-access-fftbp" (OuterVolumeSpecName: "kube-api-access-fftbp") pod "677a00d8-2c24-4e16-b790-7d05ec523d1a" (UID: "677a00d8-2c24-4e16-b790-7d05ec523d1a"). InnerVolumeSpecName "kube-api-access-fftbp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:50:05.164394 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.164361 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fftbp\" (UniqueName: \"kubernetes.io/projected/677a00d8-2c24-4e16-b790-7d05ec523d1a-kube-api-access-fftbp\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:50:05.164394 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.164390 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/677a00d8-2c24-4e16-b790-7d05ec523d1a-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:50:05.164563 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.164408 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/677a00d8-2c24-4e16-b790-7d05ec523d1a-isvc-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:50:05.164563 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.164422 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/677a00d8-2c24-4e16-b790-7d05ec523d1a-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:50:05.924717 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.924684 2576 generic.go:358] "Generic (PLEG): container finished" podID="43a042b3-d860-41f0-b238-1a95776d12b8" containerID="926216eb9f04f63952b24dad684779ac27c8fb8c2c948af2a816b56133398fb7" exitCode=0 Apr 20 20:50:05.925134 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.924759 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" event={"ID":"43a042b3-d860-41f0-b238-1a95776d12b8","Type":"ContainerDied","Data":"926216eb9f04f63952b24dad684779ac27c8fb8c2c948af2a816b56133398fb7"} Apr 20 20:50:05.926659 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.926637 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" event={"ID":"677a00d8-2c24-4e16-b790-7d05ec523d1a","Type":"ContainerDied","Data":"f542a599912dabd8b0ad14b79f83e2504a5c75dba9ef4487b22b4a429b0309f4"} Apr 20 20:50:05.926769 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.926662 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj" Apr 20 20:50:05.926769 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.926679 2576 scope.go:117] "RemoveContainer" containerID="92aad0ab8e5a2dc620e0286d6ad30702ce2e8af032c02394d3ec3b30aa810a07" Apr 20 20:50:05.935868 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.935851 2576 scope.go:117] "RemoveContainer" containerID="dee4e17d1482c7e6e7085befd73dbf2ec47c128d35f9ea299566acb3ee1bf7ae" Apr 20 20:50:05.946674 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.946635 2576 scope.go:117] "RemoveContainer" containerID="8f11c71138d8f6e3b8601c83a7c26b2e104ffc902ccd0c24dbf9a4b599370972" Apr 20 20:50:05.956020 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.955993 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"] Apr 20 20:50:05.958804 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:05.958779 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-gttwj"] Apr 20 20:50:06.932174 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:06.932132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" event={"ID":"43a042b3-d860-41f0-b238-1a95776d12b8","Type":"ContainerStarted","Data":"c2ccdf631808a4d895df086e2a6c28cea5e101399d48101ac5478913471bc49f"} Apr 20 20:50:06.932568 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:06.932184 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" event={"ID":"43a042b3-d860-41f0-b238-1a95776d12b8","Type":"ContainerStarted","Data":"7d10aee7bf4f8500a6bd9c28bfe8fe47be0bfb782f773aa9c4f9f56195cbe09e"} Apr 20 20:50:06.932568 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:06.932393 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:06.950892 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:06.950828 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" podStartSLOduration=5.9508149 podStartE2EDuration="5.9508149s" podCreationTimestamp="2026-04-20 20:50:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:50:06.949292647 +0000 UTC m=+2681.661974644" watchObservedRunningTime="2026-04-20 20:50:06.9508149 +0000 UTC m=+2681.663496896" Apr 20 20:50:07.936245 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:07.936218 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:07.949651 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:07.949620 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" path="/var/lib/kubelet/pods/677a00d8-2c24-4e16-b790-7d05ec523d1a/volumes" Apr 20 20:50:13.947756 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:13.947728 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:44.038598 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:44.038561 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" podUID="43a042b3-d860-41f0-b238-1a95776d12b8" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 20 20:50:53.951209 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:53.951179 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:50:54.478591 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:54.478486 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:50:54.483603 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:50:54.483580 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:51:01.281291 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.281248 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db"] Apr 20 20:51:01.281785 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.281642 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" podUID="43a042b3-d860-41f0-b238-1a95776d12b8" containerName="kserve-container" containerID="cri-o://7d10aee7bf4f8500a6bd9c28bfe8fe47be0bfb782f773aa9c4f9f56195cbe09e" gracePeriod=30 Apr 20 20:51:01.281858 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.281648 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" podUID="43a042b3-d860-41f0-b238-1a95776d12b8" containerName="kube-rbac-proxy" containerID="cri-o://c2ccdf631808a4d895df086e2a6c28cea5e101399d48101ac5478913471bc49f" gracePeriod=30 Apr 20 20:51:01.349613 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.349584 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v"] Apr 20 20:51:01.350000 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.349984 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="storage-initializer" Apr 20 20:51:01.350124 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.350012 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="storage-initializer" Apr 20 20:51:01.350124 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.350024 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kserve-container" Apr 20 20:51:01.350124 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.350033 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kserve-container" Apr 20 20:51:01.350124 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.350050 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kube-rbac-proxy" Apr 20 20:51:01.350124 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.350059 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kube-rbac-proxy" Apr 20 20:51:01.350413 
ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.350159 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kube-rbac-proxy" Apr 20 20:51:01.350413 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.350179 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="677a00d8-2c24-4e16-b790-7d05ec523d1a" containerName="kserve-container" Apr 20 20:51:01.353450 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.353432 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:01.355742 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.355723 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-predictor-serving-cert\"" Apr 20 20:51:01.356005 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.355981 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\"" Apr 20 20:51:01.362154 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.362133 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v"] Apr 20 20:51:01.498195 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.498164 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v44hl\" (UniqueName: \"kubernetes.io/projected/7d48a708-3069-4fa8-84ae-bd51e5288869-kube-api-access-v44hl\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-j2w8v\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:01.498374 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.498203 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7d48a708-3069-4fa8-84ae-bd51e5288869-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-j2w8v\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:01.498374 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.498281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d48a708-3069-4fa8-84ae-bd51e5288869-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-j2w8v\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:01.498374 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.498327 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d48a708-3069-4fa8-84ae-bd51e5288869-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-j2w8v\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:01.598929 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.598899 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v44hl\" (UniqueName: 
\"kubernetes.io/projected/7d48a708-3069-4fa8-84ae-bd51e5288869-kube-api-access-v44hl\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-j2w8v\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:01.599076 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.598941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7d48a708-3069-4fa8-84ae-bd51e5288869-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-j2w8v\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:01.599076 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.598977 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d48a708-3069-4fa8-84ae-bd51e5288869-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-j2w8v\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:01.599076 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.599000 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d48a708-3069-4fa8-84ae-bd51e5288869-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-j2w8v\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:01.599283 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:51:01.599140 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-serving-cert: secret "isvc-sklearn-runtime-predictor-serving-cert" not found Apr 20 20:51:01.599283 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:51:01.599209 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d48a708-3069-4fa8-84ae-bd51e5288869-proxy-tls podName:7d48a708-3069-4fa8-84ae-bd51e5288869 nodeName:}" failed. No retries permitted until 2026-04-20 20:51:02.099192979 +0000 UTC m=+2736.811874954 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7d48a708-3069-4fa8-84ae-bd51e5288869-proxy-tls") pod "isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" (UID: "7d48a708-3069-4fa8-84ae-bd51e5288869") : secret "isvc-sklearn-runtime-predictor-serving-cert" not found Apr 20 20:51:01.599528 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.599500 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d48a708-3069-4fa8-84ae-bd51e5288869-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-j2w8v\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:01.599777 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.599754 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7d48a708-3069-4fa8-84ae-bd51e5288869-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-j2w8v\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:01.607493 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:01.607475 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v44hl\" (UniqueName: \"kubernetes.io/projected/7d48a708-3069-4fa8-84ae-bd51e5288869-kube-api-access-v44hl\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-j2w8v\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:02.103253 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:02.103226 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d48a708-3069-4fa8-84ae-bd51e5288869-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-j2w8v\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:02.105558 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:02.105528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d48a708-3069-4fa8-84ae-bd51e5288869-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-j2w8v\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:02.109617 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:02.109593 2576 generic.go:358] "Generic (PLEG): container finished" podID="43a042b3-d860-41f0-b238-1a95776d12b8" containerID="c2ccdf631808a4d895df086e2a6c28cea5e101399d48101ac5478913471bc49f" exitCode=2 Apr 20 20:51:02.109722 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:02.109666 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" event={"ID":"43a042b3-d860-41f0-b238-1a95776d12b8","Type":"ContainerDied","Data":"c2ccdf631808a4d895df086e2a6c28cea5e101399d48101ac5478913471bc49f"} Apr 20 20:51:02.264300 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:02.264269 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:02.385483 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:02.385403 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v"] Apr 20 20:51:02.388420 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:51:02.388381 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d48a708_3069_4fa8_84ae_bd51e5288869.slice/crio-10bf099bb85db4436b73b0a0790b8b7904fbf7b26a1497a95204fde89bd9a797 WatchSource:0}: Error finding container 10bf099bb85db4436b73b0a0790b8b7904fbf7b26a1497a95204fde89bd9a797: Status 404 returned error can't find the container with id 10bf099bb85db4436b73b0a0790b8b7904fbf7b26a1497a95204fde89bd9a797 Apr 20 20:51:03.114204 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:03.114168 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" event={"ID":"7d48a708-3069-4fa8-84ae-bd51e5288869","Type":"ContainerStarted","Data":"3e18e2d452598cd50a6ee258dc4a3ba25272201938b30358881af876a39026a1"} Apr 20 20:51:03.114204 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:03.114207 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" event={"ID":"7d48a708-3069-4fa8-84ae-bd51e5288869","Type":"ContainerStarted","Data":"10bf099bb85db4436b73b0a0790b8b7904fbf7b26a1497a95204fde89bd9a797"} Apr 20 20:51:03.940045 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:03.939994 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" podUID="43a042b3-d860-41f0-b238-1a95776d12b8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.54:8643/healthz\": dial tcp 10.132.0.54:8643: connect: connection refused" Apr 20 20:51:08.024638 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.024616 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:51:08.132878 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.132847 2576 generic.go:358] "Generic (PLEG): container finished" podID="43a042b3-d860-41f0-b238-1a95776d12b8" containerID="7d10aee7bf4f8500a6bd9c28bfe8fe47be0bfb782f773aa9c4f9f56195cbe09e" exitCode=0 Apr 20 20:51:08.133052 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.132927 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" Apr 20 20:51:08.133052 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.132934 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" event={"ID":"43a042b3-d860-41f0-b238-1a95776d12b8","Type":"ContainerDied","Data":"7d10aee7bf4f8500a6bd9c28bfe8fe47be0bfb782f773aa9c4f9f56195cbe09e"} Apr 20 20:51:08.133052 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.132975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db" event={"ID":"43a042b3-d860-41f0-b238-1a95776d12b8","Type":"ContainerDied","Data":"5e5482c746af094412b5cac57481245b7ce8907ef98e6a699a47196367c640a4"} Apr 20 20:51:08.133052 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.132996 2576 scope.go:117] "RemoveContainer" containerID="c2ccdf631808a4d895df086e2a6c28cea5e101399d48101ac5478913471bc49f" Apr 20 20:51:08.134247 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.134227 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerID="3e18e2d452598cd50a6ee258dc4a3ba25272201938b30358881af876a39026a1" exitCode=0 Apr 20 20:51:08.134347 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.134256 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" event={"ID":"7d48a708-3069-4fa8-84ae-bd51e5288869","Type":"ContainerDied","Data":"3e18e2d452598cd50a6ee258dc4a3ba25272201938b30358881af876a39026a1"} Apr 20 20:51:08.141041 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.141025 2576 scope.go:117] "RemoveContainer" containerID="7d10aee7bf4f8500a6bd9c28bfe8fe47be0bfb782f773aa9c4f9f56195cbe09e" Apr 20 20:51:08.148036 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.148019 2576 scope.go:117] "RemoveContainer" containerID="926216eb9f04f63952b24dad684779ac27c8fb8c2c948af2a816b56133398fb7" Apr 20 20:51:08.148218 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.148201 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n65pd\" (UniqueName: \"kubernetes.io/projected/43a042b3-d860-41f0-b238-1a95776d12b8-kube-api-access-n65pd\") pod \"43a042b3-d860-41f0-b238-1a95776d12b8\" (UID: \"43a042b3-d860-41f0-b238-1a95776d12b8\") " Apr 20 20:51:08.148331 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.148314 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/43a042b3-d860-41f0-b238-1a95776d12b8-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"43a042b3-d860-41f0-b238-1a95776d12b8\" (UID: \"43a042b3-d860-41f0-b238-1a95776d12b8\") " Apr 20 20:51:08.148401 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.148345 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43a042b3-d860-41f0-b238-1a95776d12b8-kserve-provision-location\") pod \"43a042b3-d860-41f0-b238-1a95776d12b8\" (UID: \"43a042b3-d860-41f0-b238-1a95776d12b8\") " Apr 20 20:51:08.148401 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.148386 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43a042b3-d860-41f0-b238-1a95776d12b8-proxy-tls\") pod 
\"43a042b3-d860-41f0-b238-1a95776d12b8\" (UID: \"43a042b3-d860-41f0-b238-1a95776d12b8\") " Apr 20 20:51:08.148717 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.148672 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43a042b3-d860-41f0-b238-1a95776d12b8-sklearn-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "sklearn-v2-mlserver-kube-rbac-proxy-sar-config") pod "43a042b3-d860-41f0-b238-1a95776d12b8" (UID: "43a042b3-d860-41f0-b238-1a95776d12b8"). InnerVolumeSpecName "sklearn-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:51:08.148808 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.148716 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43a042b3-d860-41f0-b238-1a95776d12b8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "43a042b3-d860-41f0-b238-1a95776d12b8" (UID: "43a042b3-d860-41f0-b238-1a95776d12b8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:51:08.150647 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.150623 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43a042b3-d860-41f0-b238-1a95776d12b8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "43a042b3-d860-41f0-b238-1a95776d12b8" (UID: "43a042b3-d860-41f0-b238-1a95776d12b8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:51:08.150647 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.150631 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43a042b3-d860-41f0-b238-1a95776d12b8-kube-api-access-n65pd" (OuterVolumeSpecName: "kube-api-access-n65pd") pod "43a042b3-d860-41f0-b238-1a95776d12b8" (UID: "43a042b3-d860-41f0-b238-1a95776d12b8"). InnerVolumeSpecName "kube-api-access-n65pd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:51:08.162169 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.162095 2576 scope.go:117] "RemoveContainer" containerID="c2ccdf631808a4d895df086e2a6c28cea5e101399d48101ac5478913471bc49f" Apr 20 20:51:08.162361 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:51:08.162340 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2ccdf631808a4d895df086e2a6c28cea5e101399d48101ac5478913471bc49f\": container with ID starting with c2ccdf631808a4d895df086e2a6c28cea5e101399d48101ac5478913471bc49f not found: ID does not exist" containerID="c2ccdf631808a4d895df086e2a6c28cea5e101399d48101ac5478913471bc49f" Apr 20 20:51:08.162414 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.162370 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2ccdf631808a4d895df086e2a6c28cea5e101399d48101ac5478913471bc49f"} err="failed to get container status \"c2ccdf631808a4d895df086e2a6c28cea5e101399d48101ac5478913471bc49f\": rpc error: code = NotFound desc = could not find container \"c2ccdf631808a4d895df086e2a6c28cea5e101399d48101ac5478913471bc49f\": container with ID starting with c2ccdf631808a4d895df086e2a6c28cea5e101399d48101ac5478913471bc49f not found: ID does not exist" Apr 20 20:51:08.162414 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.162388 2576 scope.go:117] "RemoveContainer" containerID="7d10aee7bf4f8500a6bd9c28bfe8fe47be0bfb782f773aa9c4f9f56195cbe09e" Apr 20 20:51:08.162621 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:51:08.162605 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d10aee7bf4f8500a6bd9c28bfe8fe47be0bfb782f773aa9c4f9f56195cbe09e\": container with ID starting with 7d10aee7bf4f8500a6bd9c28bfe8fe47be0bfb782f773aa9c4f9f56195cbe09e not found: ID does not exist" containerID="7d10aee7bf4f8500a6bd9c28bfe8fe47be0bfb782f773aa9c4f9f56195cbe09e" Apr 20 20:51:08.162659 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.162627 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d10aee7bf4f8500a6bd9c28bfe8fe47be0bfb782f773aa9c4f9f56195cbe09e"} err="failed to get container status \"7d10aee7bf4f8500a6bd9c28bfe8fe47be0bfb782f773aa9c4f9f56195cbe09e\": rpc error: code = NotFound desc = could not find container \"7d10aee7bf4f8500a6bd9c28bfe8fe47be0bfb782f773aa9c4f9f56195cbe09e\": container with ID starting with 7d10aee7bf4f8500a6bd9c28bfe8fe47be0bfb782f773aa9c4f9f56195cbe09e not found: ID does not exist" Apr 20 20:51:08.162659 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.162642 2576 scope.go:117] "RemoveContainer" containerID="926216eb9f04f63952b24dad684779ac27c8fb8c2c948af2a816b56133398fb7" Apr 20 20:51:08.162880 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:51:08.162864 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"926216eb9f04f63952b24dad684779ac27c8fb8c2c948af2a816b56133398fb7\": container with ID starting with 926216eb9f04f63952b24dad684779ac27c8fb8c2c948af2a816b56133398fb7 not found: ID does not exist" containerID="926216eb9f04f63952b24dad684779ac27c8fb8c2c948af2a816b56133398fb7" Apr 20 20:51:08.162924 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.162886 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"926216eb9f04f63952b24dad684779ac27c8fb8c2c948af2a816b56133398fb7"} err="failed to get container status \"926216eb9f04f63952b24dad684779ac27c8fb8c2c948af2a816b56133398fb7\": rpc error: code = NotFound desc = could not find container \"926216eb9f04f63952b24dad684779ac27c8fb8c2c948af2a816b56133398fb7\": container with ID starting with 926216eb9f04f63952b24dad684779ac27c8fb8c2c948af2a816b56133398fb7 not found: ID does not exist" Apr 20 20:51:08.249945 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.249919 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n65pd\" (UniqueName: \"kubernetes.io/projected/43a042b3-d860-41f0-b238-1a95776d12b8-kube-api-access-n65pd\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:51:08.250063 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.249951 2576 reconciler_common.go:299] "Volume detached for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/43a042b3-d860-41f0-b238-1a95776d12b8-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:51:08.250063 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.249967 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/43a042b3-d860-41f0-b238-1a95776d12b8-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:51:08.250063 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.249981 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/43a042b3-d860-41f0-b238-1a95776d12b8-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:51:08.459328 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.459260 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db"] Apr 20 20:51:08.462890 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:08.462863 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-8g2db"] Apr 20 20:51:09.141791 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:09.141755 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" event={"ID":"7d48a708-3069-4fa8-84ae-bd51e5288869","Type":"ContainerStarted","Data":"8dad194587a91dd83d47bb01b3efa00fb40f46acab506f4a390c055566e02792"} Apr 20 20:51:09.142221 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:09.141799 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" event={"ID":"7d48a708-3069-4fa8-84ae-bd51e5288869","Type":"ContainerStarted","Data":"d1847291c4691ef0ebb4138356d390233677130160219e2dff1ccc1050056271"} Apr 20 20:51:09.142221 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:09.142070 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:09.160603 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:09.160556 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" podStartSLOduration=8.160541358 podStartE2EDuration="8.160541358s" podCreationTimestamp="2026-04-20 20:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:51:09.158203015 +0000 UTC m=+2743.870885011" watchObservedRunningTime="2026-04-20 20:51:09.160541358 +0000 UTC m=+2743.873223356" Apr 20 20:51:09.944322 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:09.944285 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43a042b3-d860-41f0-b238-1a95776d12b8" path="/var/lib/kubelet/pods/43a042b3-d860-41f0-b238-1a95776d12b8/volumes" Apr 20 20:51:10.146779 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:10.146752 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:10.147935 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:10.147911 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" podUID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 20 20:51:11.150011 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:11.149966 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" podUID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 20 20:51:16.159023 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:16.158992 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:16.159939 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:16.159908 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" podUID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 20 20:51:26.160366 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:26.160320 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:38.318570 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.318505 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-65cd49579f-j2w8v_7d48a708-3069-4fa8-84ae-bd51e5288869/kserve-container/0.log" Apr 20 20:51:38.460665 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.460620 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v"] Apr 20 20:51:38.461629 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.461573 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" podUID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerName="kserve-container" containerID="cri-o://d1847291c4691ef0ebb4138356d390233677130160219e2dff1ccc1050056271" gracePeriod=30 Apr 20 20:51:38.462029 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.461973 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" podUID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerName="kube-rbac-proxy" 
containerID="cri-o://8dad194587a91dd83d47bb01b3efa00fb40f46acab506f4a390c055566e02792" gracePeriod=30 Apr 20 20:51:38.541572 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.541546 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn"] Apr 20 20:51:38.542011 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.541987 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43a042b3-d860-41f0-b238-1a95776d12b8" containerName="kserve-container" Apr 20 20:51:38.542105 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.542013 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a042b3-d860-41f0-b238-1a95776d12b8" containerName="kserve-container" Apr 20 20:51:38.542105 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.542040 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43a042b3-d860-41f0-b238-1a95776d12b8" containerName="kube-rbac-proxy" Apr 20 20:51:38.542105 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.542050 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a042b3-d860-41f0-b238-1a95776d12b8" containerName="kube-rbac-proxy" Apr 20 20:51:38.542105 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.542065 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43a042b3-d860-41f0-b238-1a95776d12b8" containerName="storage-initializer" Apr 20 20:51:38.542105 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.542075 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="43a042b3-d860-41f0-b238-1a95776d12b8" containerName="storage-initializer" Apr 20 20:51:38.542394 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.542196 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="43a042b3-d860-41f0-b238-1a95776d12b8" containerName="kube-rbac-proxy" Apr 20 20:51:38.542394 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.542213 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="43a042b3-d860-41f0-b238-1a95776d12b8" containerName="kserve-container" Apr 20 20:51:38.546612 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.546594 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:38.549260 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.549235 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-predictor-serving-cert\"" Apr 20 20:51:38.549376 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.549268 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 20 20:51:38.556382 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.556358 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn"] Apr 20 20:51:38.575396 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.575341 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d89cb\" (UniqueName: \"kubernetes.io/projected/789b255e-4490-4b58-9c14-7929862d37eb-kube-api-access-d89cb\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:38.575506 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.575397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/789b255e-4490-4b58-9c14-7929862d37eb-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:38.575575 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.575532 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/789b255e-4490-4b58-9c14-7929862d37eb-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:38.575633 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.575575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/789b255e-4490-4b58-9c14-7929862d37eb-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:38.676379 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.676347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/789b255e-4490-4b58-9c14-7929862d37eb-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:38.676498 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.676384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/789b255e-4490-4b58-9c14-7929862d37eb-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:38.676498 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.676446 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d89cb\" (UniqueName: \"kubernetes.io/projected/789b255e-4490-4b58-9c14-7929862d37eb-kube-api-access-d89cb\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:38.676632 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.676605 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/789b255e-4490-4b58-9c14-7929862d37eb-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:38.676775 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:51:38.676752 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-serving-cert: secret "isvc-sklearn-v2-runtime-predictor-serving-cert" not found Apr 20 20:51:38.676849 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:51:38.676837 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/789b255e-4490-4b58-9c14-7929862d37eb-proxy-tls podName:789b255e-4490-4b58-9c14-7929862d37eb nodeName:}" failed. No retries permitted until 2026-04-20 20:51:39.176814601 +0000 UTC m=+2773.889496575 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/789b255e-4490-4b58-9c14-7929862d37eb-proxy-tls") pod "isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" (UID: "789b255e-4490-4b58-9c14-7929862d37eb") : secret "isvc-sklearn-v2-runtime-predictor-serving-cert" not found Apr 20 20:51:38.676896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.676852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/789b255e-4490-4b58-9c14-7929862d37eb-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:38.676998 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.676980 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/789b255e-4490-4b58-9c14-7929862d37eb-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:38.685266 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:38.685244 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d89cb\" (UniqueName: \"kubernetes.io/projected/789b255e-4490-4b58-9c14-7929862d37eb-kube-api-access-d89cb\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:39.181589 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.181560 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/789b255e-4490-4b58-9c14-7929862d37eb-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:39.181764 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:51:39.181744 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-serving-cert: secret "isvc-sklearn-v2-runtime-predictor-serving-cert" not found Apr 20 20:51:39.181829 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:51:39.181819 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/789b255e-4490-4b58-9c14-7929862d37eb-proxy-tls podName:789b255e-4490-4b58-9c14-7929862d37eb nodeName:}" failed. No retries permitted until 2026-04-20 20:51:40.181799653 +0000 UTC m=+2774.894481633 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/789b255e-4490-4b58-9c14-7929862d37eb-proxy-tls") pod "isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" (UID: "789b255e-4490-4b58-9c14-7929862d37eb") : secret "isvc-sklearn-v2-runtime-predictor-serving-cert" not found Apr 20 20:51:39.252704 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.252668 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerID="8dad194587a91dd83d47bb01b3efa00fb40f46acab506f4a390c055566e02792" exitCode=2 Apr 20 20:51:39.252704 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.252700 2576 generic.go:358] "Generic (PLEG): container finished" podID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerID="d1847291c4691ef0ebb4138356d390233677130160219e2dff1ccc1050056271" exitCode=0 Apr 20 20:51:39.252983 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.252747 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" event={"ID":"7d48a708-3069-4fa8-84ae-bd51e5288869","Type":"ContainerDied","Data":"8dad194587a91dd83d47bb01b3efa00fb40f46acab506f4a390c055566e02792"} Apr 20 20:51:39.252983 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.252798 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" event={"ID":"7d48a708-3069-4fa8-84ae-bd51e5288869","Type":"ContainerDied","Data":"d1847291c4691ef0ebb4138356d390233677130160219e2dff1ccc1050056271"} Apr 20 20:51:39.302729 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.302711 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:39.383896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.383827 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v44hl\" (UniqueName: \"kubernetes.io/projected/7d48a708-3069-4fa8-84ae-bd51e5288869-kube-api-access-v44hl\") pod \"7d48a708-3069-4fa8-84ae-bd51e5288869\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " Apr 20 20:51:39.383896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.383868 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d48a708-3069-4fa8-84ae-bd51e5288869-kserve-provision-location\") pod \"7d48a708-3069-4fa8-84ae-bd51e5288869\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " Apr 20 20:51:39.384349 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.383910 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d48a708-3069-4fa8-84ae-bd51e5288869-proxy-tls\") pod \"7d48a708-3069-4fa8-84ae-bd51e5288869\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " Apr 20 20:51:39.384349 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.384034 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7d48a708-3069-4fa8-84ae-bd51e5288869-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"7d48a708-3069-4fa8-84ae-bd51e5288869\" (UID: \"7d48a708-3069-4fa8-84ae-bd51e5288869\") " Apr 20 20:51:39.384427 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.384362 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7d48a708-3069-4fa8-84ae-bd51e5288869-isvc-sklearn-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-runtime-kube-rbac-proxy-sar-config") pod "7d48a708-3069-4fa8-84ae-bd51e5288869" (UID: "7d48a708-3069-4fa8-84ae-bd51e5288869"). InnerVolumeSpecName "isvc-sklearn-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:51:39.385749 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.385732 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d48a708-3069-4fa8-84ae-bd51e5288869-kube-api-access-v44hl" (OuterVolumeSpecName: "kube-api-access-v44hl") pod "7d48a708-3069-4fa8-84ae-bd51e5288869" (UID: "7d48a708-3069-4fa8-84ae-bd51e5288869"). InnerVolumeSpecName "kube-api-access-v44hl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:51:39.386169 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.386154 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d48a708-3069-4fa8-84ae-bd51e5288869-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "7d48a708-3069-4fa8-84ae-bd51e5288869" (UID: "7d48a708-3069-4fa8-84ae-bd51e5288869"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:51:39.407721 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.407694 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d48a708-3069-4fa8-84ae-bd51e5288869-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7d48a708-3069-4fa8-84ae-bd51e5288869" (UID: "7d48a708-3069-4fa8-84ae-bd51e5288869"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:51:39.485002 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.484979 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v44hl\" (UniqueName: \"kubernetes.io/projected/7d48a708-3069-4fa8-84ae-bd51e5288869-kube-api-access-v44hl\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:51:39.485088 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.485002 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d48a708-3069-4fa8-84ae-bd51e5288869-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:51:39.485088 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.485013 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7d48a708-3069-4fa8-84ae-bd51e5288869-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:51:39.485088 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:39.485024 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/7d48a708-3069-4fa8-84ae-bd51e5288869-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:51:40.191154 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:40.191100 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/789b255e-4490-4b58-9c14-7929862d37eb-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:40.193531 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:40.193505 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/789b255e-4490-4b58-9c14-7929862d37eb-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:40.257878 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:40.257815 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" event={"ID":"7d48a708-3069-4fa8-84ae-bd51e5288869","Type":"ContainerDied","Data":"10bf099bb85db4436b73b0a0790b8b7904fbf7b26a1497a95204fde89bd9a797"} Apr 20 20:51:40.257878 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:40.257861 2576 scope.go:117] "RemoveContainer" containerID="8dad194587a91dd83d47bb01b3efa00fb40f46acab506f4a390c055566e02792" Apr 20 20:51:40.257878 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:40.257861 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v" Apr 20 20:51:40.265906 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:40.265892 2576 scope.go:117] "RemoveContainer" containerID="d1847291c4691ef0ebb4138356d390233677130160219e2dff1ccc1050056271" Apr 20 20:51:40.272858 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:40.272840 2576 scope.go:117] "RemoveContainer" containerID="3e18e2d452598cd50a6ee258dc4a3ba25272201938b30358881af876a39026a1" Apr 20 20:51:40.277158 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:40.277136 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v"] Apr 20 20:51:40.278793 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:40.278770 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-j2w8v"] Apr 20 20:51:40.358062 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:40.358041 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:40.477224 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:40.477199 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn"] Apr 20 20:51:40.478983 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:51:40.478948 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod789b255e_4490_4b58_9c14_7929862d37eb.slice/crio-5afd6dcb85d2fe62089f0cab2dcb7117cf3595d42de82450dbd14dcd5fa57e53 WatchSource:0}: Error finding container 5afd6dcb85d2fe62089f0cab2dcb7117cf3595d42de82450dbd14dcd5fa57e53: Status 404 returned error can't find the container with id 5afd6dcb85d2fe62089f0cab2dcb7117cf3595d42de82450dbd14dcd5fa57e53 Apr 20 20:51:41.262261 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:41.262226 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" event={"ID":"789b255e-4490-4b58-9c14-7929862d37eb","Type":"ContainerStarted","Data":"af539c02a56e870c417513febdc0ebfcc3351ca75da1736ce0b484832f991a8f"} Apr 20 20:51:41.262261 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:41.262267 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" event={"ID":"789b255e-4490-4b58-9c14-7929862d37eb","Type":"ContainerStarted","Data":"5afd6dcb85d2fe62089f0cab2dcb7117cf3595d42de82450dbd14dcd5fa57e53"} Apr 20 20:51:41.944260 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:41.944228 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d48a708-3069-4fa8-84ae-bd51e5288869" path="/var/lib/kubelet/pods/7d48a708-3069-4fa8-84ae-bd51e5288869/volumes" Apr 20 20:51:44.275073 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:44.275042 2576 generic.go:358] "Generic (PLEG): container finished" podID="789b255e-4490-4b58-9c14-7929862d37eb" containerID="af539c02a56e870c417513febdc0ebfcc3351ca75da1736ce0b484832f991a8f" exitCode=0 Apr 20 20:51:44.275396 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:44.275135 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" event={"ID":"789b255e-4490-4b58-9c14-7929862d37eb","Type":"ContainerDied","Data":"af539c02a56e870c417513febdc0ebfcc3351ca75da1736ce0b484832f991a8f"} Apr 20 20:51:45.279870 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:45.279834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" event={"ID":"789b255e-4490-4b58-9c14-7929862d37eb","Type":"ContainerStarted","Data":"cc194a3b9055096edc20ba3d616103a41c6873bef20f48c0836bab335fdcfdd7"} Apr 20 20:51:45.279870 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:45.279876 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" event={"ID":"789b255e-4490-4b58-9c14-7929862d37eb","Type":"ContainerStarted","Data":"9c8392508974f2463eb455e6764aead12c8ed4baf160c27c3273e1906403ca33"} Apr 20 20:51:45.280358 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:45.280085 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:45.297525 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:45.297468 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" podStartSLOduration=7.297451345 podStartE2EDuration="7.297451345s" podCreationTimestamp="2026-04-20 20:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:51:45.297276239 +0000 UTC m=+2780.009958294" watchObservedRunningTime="2026-04-20 20:51:45.297451345 +0000 UTC m=+2780.010133339" Apr 20 20:51:46.284148 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:46.284117 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:51:52.292967 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:51:52.292933 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:52:22.338655 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:22.338601 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" podUID="789b255e-4490-4b58-9c14-7929862d37eb" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 20 20:52:32.295617 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:32.295588 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:52:38.632244 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.632209 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn"] Apr 20 20:52:38.632649 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.632607 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" podUID="789b255e-4490-4b58-9c14-7929862d37eb" containerName="kserve-container" containerID="cri-o://9c8392508974f2463eb455e6764aead12c8ed4baf160c27c3273e1906403ca33" gracePeriod=30 Apr 20 20:52:38.632742 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.632709 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" podUID="789b255e-4490-4b58-9c14-7929862d37eb" containerName="kube-rbac-proxy" containerID="cri-o://cc194a3b9055096edc20ba3d616103a41c6873bef20f48c0836bab335fdcfdd7" gracePeriod=30 Apr 20 20:52:38.691320 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.691285 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc"] Apr 20 20:52:38.691697 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.691680 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerName="storage-initializer" Apr 20 20:52:38.691783 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.691701 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerName="storage-initializer" Apr 20 20:52:38.691783 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.691735 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerName="kube-rbac-proxy" Apr 20 20:52:38.691783 
ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.691744 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerName="kube-rbac-proxy" Apr 20 20:52:38.691783 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.691761 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerName="kserve-container" Apr 20 20:52:38.691783 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.691770 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerName="kserve-container" Apr 20 20:52:38.692044 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.691858 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerName="kube-rbac-proxy" Apr 20 20:52:38.692044 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.691872 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d48a708-3069-4fa8-84ae-bd51e5288869" containerName="kserve-container" Apr 20 20:52:38.696185 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.696165 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:38.698354 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.698336 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-predictor-serving-cert\"" Apr 20 20:52:38.698455 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.698357 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 20 20:52:38.703855 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.703832 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc"] Apr 20 20:52:38.736393 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.736364 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-69755fbb9-5wbvc\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:38.736492 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.736410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxznl\" (UniqueName: \"kubernetes.io/projected/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-kube-api-access-jxznl\") pod \"isvc-sklearn-v2-predictor-69755fbb9-5wbvc\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:38.736492 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.736436 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-69755fbb9-5wbvc\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:38.736492 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.736454 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-5wbvc\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:38.837524 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.837500 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-69755fbb9-5wbvc\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:38.837629 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.837550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxznl\" (UniqueName: \"kubernetes.io/projected/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-kube-api-access-jxznl\") pod \"isvc-sklearn-v2-predictor-69755fbb9-5wbvc\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:38.837629 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.837572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-69755fbb9-5wbvc\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:38.837830 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.837804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-5wbvc\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:38.837989 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:52:38.837966 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-predictor-serving-cert: secret "isvc-sklearn-v2-predictor-serving-cert" not found Apr 20 20:52:38.838064 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.838015 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-69755fbb9-5wbvc\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:38.838064 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:52:38.838054 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-proxy-tls podName:dcea1326-bb23-43e3-87f8-ce6ec2876f6c nodeName:}" failed. No retries permitted until 2026-04-20 20:52:39.338034958 +0000 UTC m=+2834.050716936 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-proxy-tls") pod "isvc-sklearn-v2-predictor-69755fbb9-5wbvc" (UID: "dcea1326-bb23-43e3-87f8-ce6ec2876f6c") : secret "isvc-sklearn-v2-predictor-serving-cert" not found Apr 20 20:52:38.838270 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.838253 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-69755fbb9-5wbvc\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:38.845969 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:38.845945 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxznl\" (UniqueName: \"kubernetes.io/projected/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-kube-api-access-jxznl\") pod \"isvc-sklearn-v2-predictor-69755fbb9-5wbvc\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:39.341457 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:39.341421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-5wbvc\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:39.343916 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:39.343883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-5wbvc\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:39.472312 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:39.472284 2576 generic.go:358] "Generic (PLEG): container finished" podID="789b255e-4490-4b58-9c14-7929862d37eb" containerID="cc194a3b9055096edc20ba3d616103a41c6873bef20f48c0836bab335fdcfdd7" exitCode=2 Apr 20 20:52:39.472449 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:39.472358 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" event={"ID":"789b255e-4490-4b58-9c14-7929862d37eb","Type":"ContainerDied","Data":"cc194a3b9055096edc20ba3d616103a41c6873bef20f48c0836bab335fdcfdd7"} Apr 20 20:52:39.607567 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:39.607494 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:39.734848 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:39.734822 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc"] Apr 20 20:52:39.737178 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:52:39.737142 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcea1326_bb23_43e3_87f8_ce6ec2876f6c.slice/crio-4726c470c2000d7c3fdcad1e161d6959611e5b93a476559b9dd3a76970c6dfdb WatchSource:0}: Error finding container 4726c470c2000d7c3fdcad1e161d6959611e5b93a476559b9dd3a76970c6dfdb: Status 404 returned error can't find the container with id 4726c470c2000d7c3fdcad1e161d6959611e5b93a476559b9dd3a76970c6dfdb Apr 20 20:52:40.476947 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:40.476913 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" event={"ID":"dcea1326-bb23-43e3-87f8-ce6ec2876f6c","Type":"ContainerStarted","Data":"9129b1ecf1e374ffb02ae338ac381809f5e3bc44abd5ffce3afe7a874bed7e09"} Apr 20 20:52:40.476947 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:40.476951 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" event={"ID":"dcea1326-bb23-43e3-87f8-ce6ec2876f6c","Type":"ContainerStarted","Data":"4726c470c2000d7c3fdcad1e161d6959611e5b93a476559b9dd3a76970c6dfdb"} Apr 20 20:52:42.288376 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:42.288339 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" podUID="789b255e-4490-4b58-9c14-7929862d37eb" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.56:8643/healthz\": dial tcp 10.132.0.56:8643: connect: connection refused" Apr 20 20:52:42.294545 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:42.294512 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" podUID="789b255e-4490-4b58-9c14-7929862d37eb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.56:8080/v2/models/isvc-sklearn-v2-runtime/ready\": dial tcp 10.132.0.56:8080: connect: connection refused" Apr 20 20:52:43.487216 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:43.487186 2576 generic.go:358] "Generic (PLEG): container finished" podID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerID="9129b1ecf1e374ffb02ae338ac381809f5e3bc44abd5ffce3afe7a874bed7e09" exitCode=0 Apr 20 20:52:43.487507 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:43.487259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" event={"ID":"dcea1326-bb23-43e3-87f8-ce6ec2876f6c","Type":"ContainerDied","Data":"9129b1ecf1e374ffb02ae338ac381809f5e3bc44abd5ffce3afe7a874bed7e09"} Apr 20 20:52:44.492311 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:44.492279 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" event={"ID":"dcea1326-bb23-43e3-87f8-ce6ec2876f6c","Type":"ContainerStarted","Data":"817ecaf184f5558b73c86488c9e574beaf8ec5ace5de50d2b76baaec21f37e34"} Apr 20 20:52:44.492718 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:44.492319 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" event={"ID":"dcea1326-bb23-43e3-87f8-ce6ec2876f6c","Type":"ContainerStarted","Data":"eec324011df7b67fc0019d04ca9d8853d23878eb8202e636a1c70a6b8d5dd9c0"} Apr 20 20:52:44.492718 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:44.492592 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:44.492794 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:44.492713 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:44.493812 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:44.493771 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 20 20:52:44.511328 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:44.511287 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" podStartSLOduration=6.511274948 podStartE2EDuration="6.511274948s" podCreationTimestamp="2026-04-20 20:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:52:44.509300278 +0000 UTC m=+2839.221982275" watchObservedRunningTime="2026-04-20 20:52:44.511274948 +0000 UTC m=+2839.223956945" Apr 20 20:52:45.500596 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:45.500551 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 20 20:52:45.672188 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:45.672164 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:52:45.688359 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:45.688290 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/789b255e-4490-4b58-9c14-7929862d37eb-kserve-provision-location\") pod \"789b255e-4490-4b58-9c14-7929862d37eb\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " Apr 20 20:52:45.688359 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:45.688341 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/789b255e-4490-4b58-9c14-7929862d37eb-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"789b255e-4490-4b58-9c14-7929862d37eb\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " Apr 20 20:52:45.688521 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:45.688401 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/789b255e-4490-4b58-9c14-7929862d37eb-proxy-tls\") pod \"789b255e-4490-4b58-9c14-7929862d37eb\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " Apr 20 20:52:45.688521 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:45.688452 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d89cb\" (UniqueName: \"kubernetes.io/projected/789b255e-4490-4b58-9c14-7929862d37eb-kube-api-access-d89cb\") pod \"789b255e-4490-4b58-9c14-7929862d37eb\" (UID: \"789b255e-4490-4b58-9c14-7929862d37eb\") " Apr 20 20:52:45.688697 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:45.688672 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/789b255e-4490-4b58-9c14-7929862d37eb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "789b255e-4490-4b58-9c14-7929862d37eb" (UID: "789b255e-4490-4b58-9c14-7929862d37eb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:52:45.688797 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:45.688771 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/789b255e-4490-4b58-9c14-7929862d37eb-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config") pod "789b255e-4490-4b58-9c14-7929862d37eb" (UID: "789b255e-4490-4b58-9c14-7929862d37eb"). InnerVolumeSpecName "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:52:45.691011 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:45.690979 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/789b255e-4490-4b58-9c14-7929862d37eb-kube-api-access-d89cb" (OuterVolumeSpecName: "kube-api-access-d89cb") pod "789b255e-4490-4b58-9c14-7929862d37eb" (UID: "789b255e-4490-4b58-9c14-7929862d37eb"). InnerVolumeSpecName "kube-api-access-d89cb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:52:45.691281 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:45.691253 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/789b255e-4490-4b58-9c14-7929862d37eb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "789b255e-4490-4b58-9c14-7929862d37eb" (UID: "789b255e-4490-4b58-9c14-7929862d37eb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:52:45.789947 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:45.789924 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/789b255e-4490-4b58-9c14-7929862d37eb-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:52:45.789947 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:45.789946 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/789b255e-4490-4b58-9c14-7929862d37eb-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:52:45.790096 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:45.789959 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d89cb\" (UniqueName: \"kubernetes.io/projected/789b255e-4490-4b58-9c14-7929862d37eb-kube-api-access-d89cb\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:52:45.790096 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:45.789970 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/789b255e-4490-4b58-9c14-7929862d37eb-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:52:46.501472 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:46.501436 2576 generic.go:358] "Generic (PLEG): container finished" podID="789b255e-4490-4b58-9c14-7929862d37eb" containerID="9c8392508974f2463eb455e6764aead12c8ed4baf160c27c3273e1906403ca33" exitCode=0 Apr 20 20:52:46.501848 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:46.501527 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" event={"ID":"789b255e-4490-4b58-9c14-7929862d37eb","Type":"ContainerDied","Data":"9c8392508974f2463eb455e6764aead12c8ed4baf160c27c3273e1906403ca33"} Apr 20 20:52:46.501848 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:46.501554 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" Apr 20 20:52:46.501848 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:46.501574 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn" event={"ID":"789b255e-4490-4b58-9c14-7929862d37eb","Type":"ContainerDied","Data":"5afd6dcb85d2fe62089f0cab2dcb7117cf3595d42de82450dbd14dcd5fa57e53"} Apr 20 20:52:46.501848 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:46.501598 2576 scope.go:117] "RemoveContainer" containerID="cc194a3b9055096edc20ba3d616103a41c6873bef20f48c0836bab335fdcfdd7" Apr 20 20:52:46.509492 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:46.509472 2576 scope.go:117] "RemoveContainer" containerID="9c8392508974f2463eb455e6764aead12c8ed4baf160c27c3273e1906403ca33" Apr 20 20:52:46.516506 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:46.516490 2576 scope.go:117] "RemoveContainer" containerID="af539c02a56e870c417513febdc0ebfcc3351ca75da1736ce0b484832f991a8f" Apr 20 20:52:46.522211 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:46.522190 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn"] Apr 20 20:52:46.523834 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:46.523821 2576 scope.go:117] "RemoveContainer" containerID="cc194a3b9055096edc20ba3d616103a41c6873bef20f48c0836bab335fdcfdd7" Apr 20 20:52:46.524054 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:52:46.524037 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc194a3b9055096edc20ba3d616103a41c6873bef20f48c0836bab335fdcfdd7\": container with ID starting with cc194a3b9055096edc20ba3d616103a41c6873bef20f48c0836bab335fdcfdd7 not found: ID does not exist" containerID="cc194a3b9055096edc20ba3d616103a41c6873bef20f48c0836bab335fdcfdd7" Apr 20 20:52:46.524099 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:46.524062 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc194a3b9055096edc20ba3d616103a41c6873bef20f48c0836bab335fdcfdd7"} err="failed to get container status \"cc194a3b9055096edc20ba3d616103a41c6873bef20f48c0836bab335fdcfdd7\": rpc error: code = NotFound desc = could not find container \"cc194a3b9055096edc20ba3d616103a41c6873bef20f48c0836bab335fdcfdd7\": container with ID starting with cc194a3b9055096edc20ba3d616103a41c6873bef20f48c0836bab335fdcfdd7 not found: ID does not exist" Apr 20 20:52:46.524099 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:46.524077 2576 scope.go:117] "RemoveContainer" containerID="9c8392508974f2463eb455e6764aead12c8ed4baf160c27c3273e1906403ca33" Apr 20 20:52:46.524345 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:52:46.524328 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c8392508974f2463eb455e6764aead12c8ed4baf160c27c3273e1906403ca33\": container with ID starting with 9c8392508974f2463eb455e6764aead12c8ed4baf160c27c3273e1906403ca33 not found: ID does not exist" containerID="9c8392508974f2463eb455e6764aead12c8ed4baf160c27c3273e1906403ca33" Apr 20 20:52:46.524403 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:46.524349 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c8392508974f2463eb455e6764aead12c8ed4baf160c27c3273e1906403ca33"} err="failed to get container status 
\"9c8392508974f2463eb455e6764aead12c8ed4baf160c27c3273e1906403ca33\": rpc error: code = NotFound desc = could not find container \"9c8392508974f2463eb455e6764aead12c8ed4baf160c27c3273e1906403ca33\": container with ID starting with 9c8392508974f2463eb455e6764aead12c8ed4baf160c27c3273e1906403ca33 not found: ID does not exist" Apr 20 20:52:46.524403 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:46.524361 2576 scope.go:117] "RemoveContainer" containerID="af539c02a56e870c417513febdc0ebfcc3351ca75da1736ce0b484832f991a8f" Apr 20 20:52:46.524539 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:52:46.524520 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af539c02a56e870c417513febdc0ebfcc3351ca75da1736ce0b484832f991a8f\": container with ID starting with af539c02a56e870c417513febdc0ebfcc3351ca75da1736ce0b484832f991a8f not found: ID does not exist" containerID="af539c02a56e870c417513febdc0ebfcc3351ca75da1736ce0b484832f991a8f" Apr 20 20:52:46.524578 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:46.524543 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af539c02a56e870c417513febdc0ebfcc3351ca75da1736ce0b484832f991a8f"} err="failed to get container status \"af539c02a56e870c417513febdc0ebfcc3351ca75da1736ce0b484832f991a8f\": rpc error: code = NotFound desc = could not find container \"af539c02a56e870c417513febdc0ebfcc3351ca75da1736ce0b484832f991a8f\": container with ID starting with af539c02a56e870c417513febdc0ebfcc3351ca75da1736ce0b484832f991a8f not found: ID does not exist" Apr 20 20:52:46.529299 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:46.529280 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-s9vvn"] Apr 20 20:52:47.944233 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:47.944201 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="789b255e-4490-4b58-9c14-7929862d37eb" path="/var/lib/kubelet/pods/789b255e-4490-4b58-9c14-7929862d37eb/volumes" Apr 20 20:52:50.502228 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:50.502203 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:52:50.502773 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:52:50.502749 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 20 20:53:00.506772 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:00.506728 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 20 20:53:10.502791 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:10.502751 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 20 20:53:20.503469 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:20.503430 2576 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 20 20:53:30.502864 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:30.502827 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 20 20:53:40.503216 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:40.503178 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 20 20:53:50.503260 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:50.503228 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:53:58.895855 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:58.895816 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc"] Apr 20 20:53:58.896415 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:58.896170 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kserve-container" containerID="cri-o://eec324011df7b67fc0019d04ca9d8853d23878eb8202e636a1c70a6b8d5dd9c0" gracePeriod=30 Apr 20 20:53:58.896415 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:58.896310 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kube-rbac-proxy" containerID="cri-o://817ecaf184f5558b73c86488c9e574beaf8ec5ace5de50d2b76baaec21f37e34" gracePeriod=30 Apr 20 20:53:58.993749 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:58.993721 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht"] Apr 20 20:53:58.994065 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:58.994053 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="789b255e-4490-4b58-9c14-7929862d37eb" containerName="kserve-container" Apr 20 20:53:58.994145 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:58.994067 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="789b255e-4490-4b58-9c14-7929862d37eb" containerName="kserve-container" Apr 20 20:53:58.994145 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:58.994083 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="789b255e-4490-4b58-9c14-7929862d37eb" containerName="storage-initializer" Apr 20 20:53:58.994145 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:58.994091 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="789b255e-4490-4b58-9c14-7929862d37eb" containerName="storage-initializer" Apr 20 20:53:58.994145 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:58.994101 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="789b255e-4490-4b58-9c14-7929862d37eb" 
containerName="kube-rbac-proxy" Apr 20 20:53:58.994145 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:58.994119 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="789b255e-4490-4b58-9c14-7929862d37eb" containerName="kube-rbac-proxy" Apr 20 20:53:58.994313 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:58.994184 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="789b255e-4490-4b58-9c14-7929862d37eb" containerName="kserve-container" Apr 20 20:53:58.994313 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:58.994192 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="789b255e-4490-4b58-9c14-7929862d37eb" containerName="kube-rbac-proxy" Apr 20 20:53:58.997304 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:58.997281 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:53:58.999868 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:58.999843 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\"" Apr 20 20:53:58.999972 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:58.999846 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-predictor-serving-cert\"" Apr 20 20:53:59.007757 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.007733 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht"] Apr 20 20:53:59.135482 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.135454 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqhrl\" (UniqueName: \"kubernetes.io/projected/29196e1f-5f82-4028-9890-4c3ea89e97c7-kube-api-access-rqhrl\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:53:59.135618 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.135495 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29196e1f-5f82-4028-9890-4c3ea89e97c7-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:53:59.135618 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.135575 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/29196e1f-5f82-4028-9890-4c3ea89e97c7-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:53:59.135618 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.135602 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29196e1f-5f82-4028-9890-4c3ea89e97c7-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:53:59.236332 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.236248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/29196e1f-5f82-4028-9890-4c3ea89e97c7-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:53:59.236332 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.236297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29196e1f-5f82-4028-9890-4c3ea89e97c7-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:53:59.236553 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.236336 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqhrl\" (UniqueName: \"kubernetes.io/projected/29196e1f-5f82-4028-9890-4c3ea89e97c7-kube-api-access-rqhrl\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:53:59.236553 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.236370 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29196e1f-5f82-4028-9890-4c3ea89e97c7-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:53:59.236553 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:53:59.236469 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-serving-cert: secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found Apr 20 20:53:59.236553 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:53:59.236546 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29196e1f-5f82-4028-9890-4c3ea89e97c7-proxy-tls podName:29196e1f-5f82-4028-9890-4c3ea89e97c7 nodeName:}" failed. No retries permitted until 2026-04-20 20:53:59.73652473 +0000 UTC m=+2914.449206705 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/29196e1f-5f82-4028-9890-4c3ea89e97c7-proxy-tls") pod "isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" (UID: "29196e1f-5f82-4028-9890-4c3ea89e97c7") : secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found Apr 20 20:53:59.236846 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.236756 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29196e1f-5f82-4028-9890-4c3ea89e97c7-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:53:59.237006 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.236983 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/29196e1f-5f82-4028-9890-4c3ea89e97c7-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:53:59.247180 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.247153 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqhrl\" (UniqueName: \"kubernetes.io/projected/29196e1f-5f82-4028-9890-4c3ea89e97c7-kube-api-access-rqhrl\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:53:59.745081 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.745038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29196e1f-5f82-4028-9890-4c3ea89e97c7-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:53:59.747169 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.747137 2576 generic.go:358] "Generic (PLEG): container finished" podID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerID="817ecaf184f5558b73c86488c9e574beaf8ec5ace5de50d2b76baaec21f37e34" exitCode=2 Apr 20 20:53:59.747310 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.747194 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" event={"ID":"dcea1326-bb23-43e3-87f8-ce6ec2876f6c","Type":"ContainerDied","Data":"817ecaf184f5558b73c86488c9e574beaf8ec5ace5de50d2b76baaec21f37e34"} Apr 20 20:53:59.747932 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.747907 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29196e1f-5f82-4028-9890-4c3ea89e97c7-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:53:59.908679 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:53:59.908641 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:54:00.030790 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:00.030767 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht"] Apr 20 20:54:00.033091 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:54:00.033066 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29196e1f_5f82_4028_9890_4c3ea89e97c7.slice/crio-b57af04852f22829ca92136305d8821f0b034a3117fd5efc801cf51f26730522 WatchSource:0}: Error finding container b57af04852f22829ca92136305d8821f0b034a3117fd5efc801cf51f26730522: Status 404 returned error can't find the container with id b57af04852f22829ca92136305d8821f0b034a3117fd5efc801cf51f26730522 Apr 20 20:54:00.035038 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:00.035018 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:54:00.498232 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:00.498186 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.57:8643/healthz\": dial tcp 10.132.0.57:8643: connect: connection refused" Apr 20 20:54:00.503611 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:00.503569 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 20 20:54:00.752140 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:00.752026 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" event={"ID":"29196e1f-5f82-4028-9890-4c3ea89e97c7","Type":"ContainerStarted","Data":"4068141a8fb6d07a29a4f55ac33dd31894d32387156a83a2a5404e8c1a7d2b2a"} Apr 20 20:54:00.752140 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:00.752071 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" event={"ID":"29196e1f-5f82-4028-9890-4c3ea89e97c7","Type":"ContainerStarted","Data":"b57af04852f22829ca92136305d8821f0b034a3117fd5efc801cf51f26730522"} Apr 20 20:54:02.945552 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:02.945529 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:54:03.068261 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.068224 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-kserve-provision-location\") pod \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " Apr 20 20:54:03.068420 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.068282 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxznl\" (UniqueName: \"kubernetes.io/projected/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-kube-api-access-jxznl\") pod \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " Apr 20 20:54:03.068420 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.068328 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " Apr 20 20:54:03.068420 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.068349 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-proxy-tls\") pod \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\" (UID: \"dcea1326-bb23-43e3-87f8-ce6ec2876f6c\") " Apr 20 20:54:03.068596 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.068491 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dcea1326-bb23-43e3-87f8-ce6ec2876f6c" (UID: "dcea1326-bb23-43e3-87f8-ce6ec2876f6c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:54:03.068660 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.068618 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:54:03.068660 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.068648 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-isvc-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-kube-rbac-proxy-sar-config") pod "dcea1326-bb23-43e3-87f8-ce6ec2876f6c" (UID: "dcea1326-bb23-43e3-87f8-ce6ec2876f6c"). InnerVolumeSpecName "isvc-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:54:03.070360 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.070340 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dcea1326-bb23-43e3-87f8-ce6ec2876f6c" (UID: "dcea1326-bb23-43e3-87f8-ce6ec2876f6c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:54:03.070441 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.070400 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-kube-api-access-jxznl" (OuterVolumeSpecName: "kube-api-access-jxznl") pod "dcea1326-bb23-43e3-87f8-ce6ec2876f6c" (UID: "dcea1326-bb23-43e3-87f8-ce6ec2876f6c"). InnerVolumeSpecName "kube-api-access-jxznl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:54:03.169771 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.169750 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:54:03.169771 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.169771 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:54:03.169889 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.169783 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jxznl\" (UniqueName: \"kubernetes.io/projected/dcea1326-bb23-43e3-87f8-ce6ec2876f6c-kube-api-access-jxznl\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:54:03.764710 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.764678 2576 generic.go:358] "Generic (PLEG): container finished" podID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerID="eec324011df7b67fc0019d04ca9d8853d23878eb8202e636a1c70a6b8d5dd9c0" exitCode=0 Apr 20 20:54:03.764896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.764748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" event={"ID":"dcea1326-bb23-43e3-87f8-ce6ec2876f6c","Type":"ContainerDied","Data":"eec324011df7b67fc0019d04ca9d8853d23878eb8202e636a1c70a6b8d5dd9c0"} Apr 20 20:54:03.764896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.764764 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" Apr 20 20:54:03.764896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.764781 2576 scope.go:117] "RemoveContainer" containerID="817ecaf184f5558b73c86488c9e574beaf8ec5ace5de50d2b76baaec21f37e34" Apr 20 20:54:03.764896 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.764771 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc" event={"ID":"dcea1326-bb23-43e3-87f8-ce6ec2876f6c","Type":"ContainerDied","Data":"4726c470c2000d7c3fdcad1e161d6959611e5b93a476559b9dd3a76970c6dfdb"} Apr 20 20:54:03.773009 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.772987 2576 scope.go:117] "RemoveContainer" containerID="eec324011df7b67fc0019d04ca9d8853d23878eb8202e636a1c70a6b8d5dd9c0" Apr 20 20:54:03.779985 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.779969 2576 scope.go:117] "RemoveContainer" containerID="9129b1ecf1e374ffb02ae338ac381809f5e3bc44abd5ffce3afe7a874bed7e09" Apr 20 20:54:03.786306 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.786283 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc"] Apr 20 20:54:03.786738 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.786720 2576 scope.go:117] "RemoveContainer" containerID="817ecaf184f5558b73c86488c9e574beaf8ec5ace5de50d2b76baaec21f37e34" Apr 20 20:54:03.787028 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:54:03.787012 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"817ecaf184f5558b73c86488c9e574beaf8ec5ace5de50d2b76baaec21f37e34\": container with ID starting with 817ecaf184f5558b73c86488c9e574beaf8ec5ace5de50d2b76baaec21f37e34 not found: ID does not exist" containerID="817ecaf184f5558b73c86488c9e574beaf8ec5ace5de50d2b76baaec21f37e34" Apr 20 20:54:03.787098 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.787037 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817ecaf184f5558b73c86488c9e574beaf8ec5ace5de50d2b76baaec21f37e34"} err="failed to get container status \"817ecaf184f5558b73c86488c9e574beaf8ec5ace5de50d2b76baaec21f37e34\": rpc error: code = NotFound desc = could not find container \"817ecaf184f5558b73c86488c9e574beaf8ec5ace5de50d2b76baaec21f37e34\": container with ID starting with 817ecaf184f5558b73c86488c9e574beaf8ec5ace5de50d2b76baaec21f37e34 not found: ID does not exist" Apr 20 20:54:03.787098 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.787053 2576 scope.go:117] "RemoveContainer" containerID="eec324011df7b67fc0019d04ca9d8853d23878eb8202e636a1c70a6b8d5dd9c0" Apr 20 20:54:03.787323 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:54:03.787303 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec324011df7b67fc0019d04ca9d8853d23878eb8202e636a1c70a6b8d5dd9c0\": container with ID starting with eec324011df7b67fc0019d04ca9d8853d23878eb8202e636a1c70a6b8d5dd9c0 not found: ID does not exist" containerID="eec324011df7b67fc0019d04ca9d8853d23878eb8202e636a1c70a6b8d5dd9c0" Apr 20 20:54:03.787374 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.787329 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec324011df7b67fc0019d04ca9d8853d23878eb8202e636a1c70a6b8d5dd9c0"} err="failed to get container status 
\"eec324011df7b67fc0019d04ca9d8853d23878eb8202e636a1c70a6b8d5dd9c0\": rpc error: code = NotFound desc = could not find container \"eec324011df7b67fc0019d04ca9d8853d23878eb8202e636a1c70a6b8d5dd9c0\": container with ID starting with eec324011df7b67fc0019d04ca9d8853d23878eb8202e636a1c70a6b8d5dd9c0 not found: ID does not exist" Apr 20 20:54:03.787374 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.787344 2576 scope.go:117] "RemoveContainer" containerID="9129b1ecf1e374ffb02ae338ac381809f5e3bc44abd5ffce3afe7a874bed7e09" Apr 20 20:54:03.787582 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:54:03.787565 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9129b1ecf1e374ffb02ae338ac381809f5e3bc44abd5ffce3afe7a874bed7e09\": container with ID starting with 9129b1ecf1e374ffb02ae338ac381809f5e3bc44abd5ffce3afe7a874bed7e09 not found: ID does not exist" containerID="9129b1ecf1e374ffb02ae338ac381809f5e3bc44abd5ffce3afe7a874bed7e09" Apr 20 20:54:03.787634 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.787590 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9129b1ecf1e374ffb02ae338ac381809f5e3bc44abd5ffce3afe7a874bed7e09"} err="failed to get container status \"9129b1ecf1e374ffb02ae338ac381809f5e3bc44abd5ffce3afe7a874bed7e09\": rpc error: code = NotFound desc = could not find container \"9129b1ecf1e374ffb02ae338ac381809f5e3bc44abd5ffce3afe7a874bed7e09\": container with ID starting with 9129b1ecf1e374ffb02ae338ac381809f5e3bc44abd5ffce3afe7a874bed7e09 not found: ID does not exist" Apr 20 20:54:03.790426 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.790405 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-5wbvc"] Apr 20 20:54:03.944614 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:03.944582 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" path="/var/lib/kubelet/pods/dcea1326-bb23-43e3-87f8-ce6ec2876f6c/volumes" Apr 20 20:54:04.769579 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:04.769550 2576 generic.go:358] "Generic (PLEG): container finished" podID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerID="4068141a8fb6d07a29a4f55ac33dd31894d32387156a83a2a5404e8c1a7d2b2a" exitCode=0 Apr 20 20:54:04.769997 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:04.769636 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" event={"ID":"29196e1f-5f82-4028-9890-4c3ea89e97c7","Type":"ContainerDied","Data":"4068141a8fb6d07a29a4f55ac33dd31894d32387156a83a2a5404e8c1a7d2b2a"} Apr 20 20:54:05.776101 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:05.776069 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" event={"ID":"29196e1f-5f82-4028-9890-4c3ea89e97c7","Type":"ContainerStarted","Data":"42b674022d57184d2903e75d637a61d7ddbd4632ce31ac5da19ef4d0e8ed1fbc"} Apr 20 20:54:05.776584 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:05.776105 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" event={"ID":"29196e1f-5f82-4028-9890-4c3ea89e97c7","Type":"ContainerStarted","Data":"ab3765163940e4f5c5243de66d6b9d15a9c30fb2060399cb2efbd027b9e84cf0"} Apr 20 20:54:05.776584 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:05.776376 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:54:05.796320 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:05.796282 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" podStartSLOduration=7.796269241 podStartE2EDuration="7.796269241s" podCreationTimestamp="2026-04-20 20:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:54:05.794094264 +0000 UTC m=+2920.506776261" watchObservedRunningTime="2026-04-20 20:54:05.796269241 +0000 UTC m=+2920.508951238" Apr 20 20:54:06.779336 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:06.779303 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:54:06.780581 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:06.780548 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 20 20:54:07.782566 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:07.782528 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 20 20:54:12.787352 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:12.787320 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:54:12.787891 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:12.787865 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 20 20:54:22.787960 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:22.787925 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 20 20:54:32.788276 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:32.788233 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 20 20:54:42.787882 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:42.787800 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 20 20:54:52.788283 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:54:52.788240 
2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 20 20:55:02.788285 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:02.788243 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 20 20:55:12.788290 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:12.788254 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:55:19.126340 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:19.126306 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht"] Apr 20 20:55:19.126939 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:19.126904 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kserve-container" containerID="cri-o://ab3765163940e4f5c5243de66d6b9d15a9c30fb2060399cb2efbd027b9e84cf0" gracePeriod=30 Apr 20 20:55:19.127101 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:19.126927 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kube-rbac-proxy" containerID="cri-o://42b674022d57184d2903e75d637a61d7ddbd4632ce31ac5da19ef4d0e8ed1fbc" gracePeriod=30 Apr 20 20:55:20.017004 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:20.016970 2576 generic.go:358] "Generic (PLEG): container finished" podID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerID="42b674022d57184d2903e75d637a61d7ddbd4632ce31ac5da19ef4d0e8ed1fbc" exitCode=2 Apr 20 20:55:20.017187 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:20.017044 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" event={"ID":"29196e1f-5f82-4028-9890-4c3ea89e97c7","Type":"ContainerDied","Data":"42b674022d57184d2903e75d637a61d7ddbd4632ce31ac5da19ef4d0e8ed1fbc"} Apr 20 20:55:22.783499 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:22.783459 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.58:8643/healthz\": dial tcp 10.132.0.58:8643: connect: connection refused" Apr 20 20:55:22.787804 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:22.787779 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 20 20:55:23.029368 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:23.029330 2576 generic.go:358] "Generic (PLEG): container finished" podID="29196e1f-5f82-4028-9890-4c3ea89e97c7" 
containerID="ab3765163940e4f5c5243de66d6b9d15a9c30fb2060399cb2efbd027b9e84cf0" exitCode=0 Apr 20 20:55:23.029493 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:23.029387 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" event={"ID":"29196e1f-5f82-4028-9890-4c3ea89e97c7","Type":"ContainerDied","Data":"ab3765163940e4f5c5243de66d6b9d15a9c30fb2060399cb2efbd027b9e84cf0"} Apr 20 20:55:23.066223 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:23.066204 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:55:23.174012 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:23.173987 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29196e1f-5f82-4028-9890-4c3ea89e97c7-kserve-provision-location\") pod \"29196e1f-5f82-4028-9890-4c3ea89e97c7\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " Apr 20 20:55:23.174161 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:23.174031 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/29196e1f-5f82-4028-9890-4c3ea89e97c7-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"29196e1f-5f82-4028-9890-4c3ea89e97c7\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " Apr 20 20:55:23.174161 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:23.174082 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqhrl\" (UniqueName: \"kubernetes.io/projected/29196e1f-5f82-4028-9890-4c3ea89e97c7-kube-api-access-rqhrl\") pod \"29196e1f-5f82-4028-9890-4c3ea89e97c7\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " Apr 20 20:55:23.174161 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:23.174155 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29196e1f-5f82-4028-9890-4c3ea89e97c7-proxy-tls\") pod \"29196e1f-5f82-4028-9890-4c3ea89e97c7\" (UID: \"29196e1f-5f82-4028-9890-4c3ea89e97c7\") " Apr 20 20:55:23.174421 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:23.174339 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29196e1f-5f82-4028-9890-4c3ea89e97c7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "29196e1f-5f82-4028-9890-4c3ea89e97c7" (UID: "29196e1f-5f82-4028-9890-4c3ea89e97c7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:55:23.174551 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:23.174524 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29196e1f-5f82-4028-9890-4c3ea89e97c7-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config") pod "29196e1f-5f82-4028-9890-4c3ea89e97c7" (UID: "29196e1f-5f82-4028-9890-4c3ea89e97c7"). InnerVolumeSpecName "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:55:23.176201 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:23.176173 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29196e1f-5f82-4028-9890-4c3ea89e97c7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "29196e1f-5f82-4028-9890-4c3ea89e97c7" (UID: "29196e1f-5f82-4028-9890-4c3ea89e97c7"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:55:23.176303 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:23.176214 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29196e1f-5f82-4028-9890-4c3ea89e97c7-kube-api-access-rqhrl" (OuterVolumeSpecName: "kube-api-access-rqhrl") pod "29196e1f-5f82-4028-9890-4c3ea89e97c7" (UID: "29196e1f-5f82-4028-9890-4c3ea89e97c7"). InnerVolumeSpecName "kube-api-access-rqhrl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:55:23.274892 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:23.274867 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29196e1f-5f82-4028-9890-4c3ea89e97c7-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:55:23.274892 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:23.274891 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29196e1f-5f82-4028-9890-4c3ea89e97c7-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:55:23.275036 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:23.274902 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/29196e1f-5f82-4028-9890-4c3ea89e97c7-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:55:23.275036 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:23.274912 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rqhrl\" (UniqueName: \"kubernetes.io/projected/29196e1f-5f82-4028-9890-4c3ea89e97c7-kube-api-access-rqhrl\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:55:24.033823 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:24.033798 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" Apr 20 20:55:24.034203 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:24.033803 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht" event={"ID":"29196e1f-5f82-4028-9890-4c3ea89e97c7","Type":"ContainerDied","Data":"b57af04852f22829ca92136305d8821f0b034a3117fd5efc801cf51f26730522"} Apr 20 20:55:24.034203 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:24.033909 2576 scope.go:117] "RemoveContainer" containerID="42b674022d57184d2903e75d637a61d7ddbd4632ce31ac5da19ef4d0e8ed1fbc" Apr 20 20:55:24.042089 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:24.042054 2576 scope.go:117] "RemoveContainer" containerID="ab3765163940e4f5c5243de66d6b9d15a9c30fb2060399cb2efbd027b9e84cf0" Apr 20 20:55:24.052474 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:24.052450 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht"] Apr 20 20:55:24.053166 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:24.052876 2576 scope.go:117] "RemoveContainer" containerID="4068141a8fb6d07a29a4f55ac33dd31894d32387156a83a2a5404e8c1a7d2b2a" Apr 20 20:55:24.055038 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:24.055018 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-2cbht"] Apr 20 20:55:25.945407 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:25.945367 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" path="/var/lib/kubelet/pods/29196e1f-5f82-4028-9890-4c3ea89e97c7/volumes" Apr 20 20:55:54.501767 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:54.501649 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:55:54.507733 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:55:54.507713 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 20:56:00.599703 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.599666 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v"] Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.599986 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kserve-container" Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.599997 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kserve-container" Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.600005 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kube-rbac-proxy" Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.600011 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kube-rbac-proxy" Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.600023 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" 
containerName="storage-initializer" Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.600028 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="storage-initializer" Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.600036 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="storage-initializer" Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.600041 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="storage-initializer" Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.600052 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kserve-container" Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.600057 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kserve-container" Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.600064 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kube-rbac-proxy" Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.600068 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kube-rbac-proxy" Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.600150 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kserve-container" Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.600159 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dcea1326-bb23-43e3-87f8-ce6ec2876f6c" containerName="kube-rbac-proxy" Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.600165 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kube-rbac-proxy" Apr 20 20:56:00.600178 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.600172 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="29196e1f-5f82-4028-9890-4c3ea89e97c7" containerName="kserve-container" Apr 20 20:56:00.602915 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.602898 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:00.605475 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.605451 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 20 20:56:00.605630 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.605611 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\"" Apr 20 20:56:00.605698 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.605632 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 20 20:56:00.605698 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.605638 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\"" Apr 20 20:56:00.605948 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.605933 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5zv6\"" Apr 20 20:56:00.613064 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.613034 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v"] Apr 20 20:56:00.644238 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.644206 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0882a4cb-0400-4962-b778-7841f8fb63e5-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:00.644340 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.644255 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8c8v\" (UniqueName: \"kubernetes.io/projected/0882a4cb-0400-4962-b778-7841f8fb63e5-kube-api-access-h8c8v\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:00.644400 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.644381 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0882a4cb-0400-4962-b778-7841f8fb63e5-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:00.644441 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.644414 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0882a4cb-0400-4962-b778-7841f8fb63e5-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:00.745751 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.745724 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0882a4cb-0400-4962-b778-7841f8fb63e5-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:00.745860 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.745757 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0882a4cb-0400-4962-b778-7841f8fb63e5-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:00.745860 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.745785 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0882a4cb-0400-4962-b778-7841f8fb63e5-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:00.745860 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.745805 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h8c8v\" (UniqueName: \"kubernetes.io/projected/0882a4cb-0400-4962-b778-7841f8fb63e5-kube-api-access-h8c8v\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:00.746011 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:56:00.745875 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-serving-cert: secret "isvc-tensorflow-runtime-predictor-serving-cert" not found Apr 20 20:56:00.746011 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:56:00.745932 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0882a4cb-0400-4962-b778-7841f8fb63e5-proxy-tls podName:0882a4cb-0400-4962-b778-7841f8fb63e5 nodeName:}" failed. No retries permitted until 2026-04-20 20:56:01.245913331 +0000 UTC m=+3035.958595318 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0882a4cb-0400-4962-b778-7841f8fb63e5-proxy-tls") pod "isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" (UID: "0882a4cb-0400-4962-b778-7841f8fb63e5") : secret "isvc-tensorflow-runtime-predictor-serving-cert" not found Apr 20 20:56:00.746245 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.746221 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0882a4cb-0400-4962-b778-7841f8fb63e5-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:00.746445 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.746427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0882a4cb-0400-4962-b778-7841f8fb63e5-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:00.753937 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:00.753920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8c8v\" (UniqueName: \"kubernetes.io/projected/0882a4cb-0400-4962-b778-7841f8fb63e5-kube-api-access-h8c8v\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:01.250985 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:01.250952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0882a4cb-0400-4962-b778-7841f8fb63e5-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:01.253376 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:01.253347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0882a4cb-0400-4962-b778-7841f8fb63e5-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:01.514200 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:01.514097 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:01.637677 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:01.637627 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v"] Apr 20 20:56:01.640099 ip-10-0-143-23 kubenswrapper[2576]: W0420 20:56:01.640073 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0882a4cb_0400_4962_b778_7841f8fb63e5.slice/crio-b2228c09dbd19a7c1a6f40bab989d33f2a876730108b96640df50a679cabbe6e WatchSource:0}: Error finding container b2228c09dbd19a7c1a6f40bab989d33f2a876730108b96640df50a679cabbe6e: Status 404 returned error can't find the container with id b2228c09dbd19a7c1a6f40bab989d33f2a876730108b96640df50a679cabbe6e Apr 20 20:56:02.164272 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:02.164234 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" event={"ID":"0882a4cb-0400-4962-b778-7841f8fb63e5","Type":"ContainerStarted","Data":"a4131a96f430fe2bb58413424aec9d26bb425e83ba63a310b1ab72ee14b764a4"} Apr 20 20:56:02.164272 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:02.164276 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" event={"ID":"0882a4cb-0400-4962-b778-7841f8fb63e5","Type":"ContainerStarted","Data":"b2228c09dbd19a7c1a6f40bab989d33f2a876730108b96640df50a679cabbe6e"} Apr 20 20:56:07.184390 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:07.184352 2576 generic.go:358] "Generic (PLEG): container finished" podID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerID="a4131a96f430fe2bb58413424aec9d26bb425e83ba63a310b1ab72ee14b764a4" exitCode=0 Apr 20 20:56:07.184802 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:07.184394 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" event={"ID":"0882a4cb-0400-4962-b778-7841f8fb63e5","Type":"ContainerDied","Data":"a4131a96f430fe2bb58413424aec9d26bb425e83ba63a310b1ab72ee14b764a4"} Apr 20 20:56:11.204120 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:11.204070 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" event={"ID":"0882a4cb-0400-4962-b778-7841f8fb63e5","Type":"ContainerStarted","Data":"37c57a388ad25ba28fe19b49510e72c82c75af6363c6e2ccdd26018904144006"} Apr 20 20:56:11.204120 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:11.204105 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" event={"ID":"0882a4cb-0400-4962-b778-7841f8fb63e5","Type":"ContainerStarted","Data":"afd6922e23782d7045c4a82d45822916da5330a796c9583e924c258719a156c8"} Apr 20 20:56:11.204508 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:11.204327 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:11.226610 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:11.226557 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" podStartSLOduration=7.69529382 podStartE2EDuration="11.22654207s" podCreationTimestamp="2026-04-20 20:56:00 +0000 UTC" 
firstStartedPulling="2026-04-20 20:56:07.185552836 +0000 UTC m=+3041.898234811" lastFinishedPulling="2026-04-20 20:56:10.716801083 +0000 UTC m=+3045.429483061" observedRunningTime="2026-04-20 20:56:11.225122004 +0000 UTC m=+3045.937803992" watchObservedRunningTime="2026-04-20 20:56:11.22654207 +0000 UTC m=+3045.939224067" Apr 20 20:56:12.207274 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:12.207241 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:12.208297 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:12.208272 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.59:8080: connect: connection refused" Apr 20 20:56:13.209903 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:13.209851 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.59:8080: connect: connection refused" Apr 20 20:56:18.214466 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:18.214439 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:18.214964 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:18.214935 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.59:8080: connect: connection refused" Apr 20 20:56:28.215988 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:28.215957 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:40.570162 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:40.570130 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v"] Apr 20 20:56:40.570539 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:40.570422 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kserve-container" containerID="cri-o://afd6922e23782d7045c4a82d45822916da5330a796c9583e924c258719a156c8" gracePeriod=30 Apr 20 20:56:40.570539 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:40.570465 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kube-rbac-proxy" containerID="cri-o://37c57a388ad25ba28fe19b49510e72c82c75af6363c6e2ccdd26018904144006" gracePeriod=30 Apr 20 20:56:41.303338 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:41.303306 2576 generic.go:358] "Generic (PLEG): container finished" podID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerID="37c57a388ad25ba28fe19b49510e72c82c75af6363c6e2ccdd26018904144006" exitCode=2 Apr 20 20:56:41.303502 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:41.303381 2576 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" event={"ID":"0882a4cb-0400-4962-b778-7841f8fb63e5","Type":"ContainerDied","Data":"37c57a388ad25ba28fe19b49510e72c82c75af6363c6e2ccdd26018904144006"} Apr 20 20:56:43.210315 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:43.210274 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.59:8643/healthz\": dial tcp 10.132.0.59:8643: connect: connection refused" Apr 20 20:56:48.210826 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:48.210789 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.59:8643/healthz\": dial tcp 10.132.0.59:8643: connect: connection refused" Apr 20 20:56:53.210362 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:53.210326 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.59:8643/healthz\": dial tcp 10.132.0.59:8643: connect: connection refused" Apr 20 20:56:53.210718 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:53.210430 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:56:58.210889 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:56:58.210847 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.59:8643/healthz\": dial tcp 10.132.0.59:8643: connect: connection refused" Apr 20 20:57:03.210938 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:03.210896 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.59:8643/healthz\": dial tcp 10.132.0.59:8643: connect: connection refused" Apr 20 20:57:08.210619 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:08.210577 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.59:8643/healthz\": dial tcp 10.132.0.59:8643: connect: connection refused" Apr 20 20:57:11.202088 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.202065 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:57:11.383136 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.383042 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0882a4cb-0400-4962-b778-7841f8fb63e5-proxy-tls\") pod \"0882a4cb-0400-4962-b778-7841f8fb63e5\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " Apr 20 20:57:11.383136 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.383106 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8c8v\" (UniqueName: \"kubernetes.io/projected/0882a4cb-0400-4962-b778-7841f8fb63e5-kube-api-access-h8c8v\") pod \"0882a4cb-0400-4962-b778-7841f8fb63e5\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " Apr 20 20:57:11.383351 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.383158 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0882a4cb-0400-4962-b778-7841f8fb63e5-kserve-provision-location\") pod \"0882a4cb-0400-4962-b778-7841f8fb63e5\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " Apr 20 20:57:11.383351 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.383188 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0882a4cb-0400-4962-b778-7841f8fb63e5-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"0882a4cb-0400-4962-b778-7841f8fb63e5\" (UID: \"0882a4cb-0400-4962-b778-7841f8fb63e5\") " Apr 20 20:57:11.383569 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.383534 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0882a4cb-0400-4962-b778-7841f8fb63e5-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "0882a4cb-0400-4962-b778-7841f8fb63e5" (UID: "0882a4cb-0400-4962-b778-7841f8fb63e5"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 20:57:11.385200 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.385176 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0882a4cb-0400-4962-b778-7841f8fb63e5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0882a4cb-0400-4962-b778-7841f8fb63e5" (UID: "0882a4cb-0400-4962-b778-7841f8fb63e5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:57:11.385372 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.385352 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0882a4cb-0400-4962-b778-7841f8fb63e5-kube-api-access-h8c8v" (OuterVolumeSpecName: "kube-api-access-h8c8v") pod "0882a4cb-0400-4962-b778-7841f8fb63e5" (UID: "0882a4cb-0400-4962-b778-7841f8fb63e5"). InnerVolumeSpecName "kube-api-access-h8c8v". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:57:11.394330 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.394298 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0882a4cb-0400-4962-b778-7841f8fb63e5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0882a4cb-0400-4962-b778-7841f8fb63e5" (UID: "0882a4cb-0400-4962-b778-7841f8fb63e5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:57:11.408364 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.408335 2576 generic.go:358] "Generic (PLEG): container finished" podID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerID="afd6922e23782d7045c4a82d45822916da5330a796c9583e924c258719a156c8" exitCode=137 Apr 20 20:57:11.408468 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.408407 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" Apr 20 20:57:11.408468 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.408428 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" event={"ID":"0882a4cb-0400-4962-b778-7841f8fb63e5","Type":"ContainerDied","Data":"afd6922e23782d7045c4a82d45822916da5330a796c9583e924c258719a156c8"} Apr 20 20:57:11.408468 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.408465 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v" event={"ID":"0882a4cb-0400-4962-b778-7841f8fb63e5","Type":"ContainerDied","Data":"b2228c09dbd19a7c1a6f40bab989d33f2a876730108b96640df50a679cabbe6e"} Apr 20 20:57:11.408586 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.408487 2576 scope.go:117] "RemoveContainer" containerID="37c57a388ad25ba28fe19b49510e72c82c75af6363c6e2ccdd26018904144006" Apr 20 20:57:11.417696 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.417675 2576 scope.go:117] "RemoveContainer" containerID="afd6922e23782d7045c4a82d45822916da5330a796c9583e924c258719a156c8" Apr 20 20:57:11.424724 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.424704 2576 scope.go:117] "RemoveContainer" containerID="a4131a96f430fe2bb58413424aec9d26bb425e83ba63a310b1ab72ee14b764a4" Apr 20 20:57:11.431421 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.431396 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v"] Apr 20 20:57:11.432170 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.432148 2576 scope.go:117] "RemoveContainer" containerID="37c57a388ad25ba28fe19b49510e72c82c75af6363c6e2ccdd26018904144006" Apr 20 20:57:11.432394 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:57:11.432377 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37c57a388ad25ba28fe19b49510e72c82c75af6363c6e2ccdd26018904144006\": container with ID starting with 37c57a388ad25ba28fe19b49510e72c82c75af6363c6e2ccdd26018904144006 not found: ID does not exist" containerID="37c57a388ad25ba28fe19b49510e72c82c75af6363c6e2ccdd26018904144006" Apr 20 20:57:11.432456 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.432401 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37c57a388ad25ba28fe19b49510e72c82c75af6363c6e2ccdd26018904144006"} err="failed to get container status 
\"37c57a388ad25ba28fe19b49510e72c82c75af6363c6e2ccdd26018904144006\": rpc error: code = NotFound desc = could not find container \"37c57a388ad25ba28fe19b49510e72c82c75af6363c6e2ccdd26018904144006\": container with ID starting with 37c57a388ad25ba28fe19b49510e72c82c75af6363c6e2ccdd26018904144006 not found: ID does not exist" Apr 20 20:57:11.432456 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.432419 2576 scope.go:117] "RemoveContainer" containerID="afd6922e23782d7045c4a82d45822916da5330a796c9583e924c258719a156c8" Apr 20 20:57:11.432682 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:57:11.432659 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afd6922e23782d7045c4a82d45822916da5330a796c9583e924c258719a156c8\": container with ID starting with afd6922e23782d7045c4a82d45822916da5330a796c9583e924c258719a156c8 not found: ID does not exist" containerID="afd6922e23782d7045c4a82d45822916da5330a796c9583e924c258719a156c8" Apr 20 20:57:11.432796 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.432691 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afd6922e23782d7045c4a82d45822916da5330a796c9583e924c258719a156c8"} err="failed to get container status \"afd6922e23782d7045c4a82d45822916da5330a796c9583e924c258719a156c8\": rpc error: code = NotFound desc = could not find container \"afd6922e23782d7045c4a82d45822916da5330a796c9583e924c258719a156c8\": container with ID starting with afd6922e23782d7045c4a82d45822916da5330a796c9583e924c258719a156c8 not found: ID does not exist" Apr 20 20:57:11.432796 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.432713 2576 scope.go:117] "RemoveContainer" containerID="a4131a96f430fe2bb58413424aec9d26bb425e83ba63a310b1ab72ee14b764a4" Apr 20 20:57:11.433414 ip-10-0-143-23 kubenswrapper[2576]: E0420 20:57:11.433330 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4131a96f430fe2bb58413424aec9d26bb425e83ba63a310b1ab72ee14b764a4\": container with ID starting with a4131a96f430fe2bb58413424aec9d26bb425e83ba63a310b1ab72ee14b764a4 not found: ID does not exist" containerID="a4131a96f430fe2bb58413424aec9d26bb425e83ba63a310b1ab72ee14b764a4" Apr 20 20:57:11.433414 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.433391 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4131a96f430fe2bb58413424aec9d26bb425e83ba63a310b1ab72ee14b764a4"} err="failed to get container status \"a4131a96f430fe2bb58413424aec9d26bb425e83ba63a310b1ab72ee14b764a4\": rpc error: code = NotFound desc = could not find container \"a4131a96f430fe2bb58413424aec9d26bb425e83ba63a310b1ab72ee14b764a4\": container with ID starting with a4131a96f430fe2bb58413424aec9d26bb425e83ba63a310b1ab72ee14b764a4 not found: ID does not exist" Apr 20 20:57:11.434761 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.434743 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-8vx6v"] Apr 20 20:57:11.484173 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.484152 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0882a4cb-0400-4962-b778-7841f8fb63e5-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:57:11.484255 ip-10-0-143-23 
kubenswrapper[2576]: I0420 20:57:11.484174 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0882a4cb-0400-4962-b778-7841f8fb63e5-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:57:11.484255 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.484186 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h8c8v\" (UniqueName: \"kubernetes.io/projected/0882a4cb-0400-4962-b778-7841f8fb63e5-kube-api-access-h8c8v\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:57:11.484255 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.484196 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0882a4cb-0400-4962-b778-7841f8fb63e5-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 20:57:11.944464 ip-10-0-143-23 kubenswrapper[2576]: I0420 20:57:11.944434 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" path="/var/lib/kubelet/pods/0882a4cb-0400-4962-b778-7841f8fb63e5/volumes" Apr 20 21:00:23.755788 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.755757 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs"] Apr 20 21:00:23.756232 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.756128 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kserve-container" Apr 20 21:00:23.756232 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.756141 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kserve-container" Apr 20 21:00:23.756232 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.756158 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="storage-initializer" Apr 20 21:00:23.756232 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.756164 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="storage-initializer" Apr 20 21:00:23.756232 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.756174 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kube-rbac-proxy" Apr 20 21:00:23.756232 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.756180 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kube-rbac-proxy" Apr 20 21:00:23.756429 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.756236 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kserve-container" Apr 20 21:00:23.756429 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.756247 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0882a4cb-0400-4962-b778-7841f8fb63e5" containerName="kube-rbac-proxy" Apr 20 21:00:23.759289 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.759274 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:23.761703 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.761678 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-predictor-serving-cert\"" Apr 20 21:00:23.761837 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.761765 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5zv6\"" Apr 20 21:00:23.761837 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.761812 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 20 21:00:23.761958 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.761866 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 20 21:00:23.761958 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.761912 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 20 21:00:23.769350 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.769320 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs"] Apr 20 21:00:23.860417 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.860389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/087ffc97-2de9-405c-b35e-6882975ff471-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:23.860534 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.860422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/087ffc97-2de9-405c-b35e-6882975ff471-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:23.860534 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.860441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8jn2\" (UniqueName: \"kubernetes.io/projected/087ffc97-2de9-405c-b35e-6882975ff471-kube-api-access-t8jn2\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:23.860629 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.860542 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/087ffc97-2de9-405c-b35e-6882975ff471-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:23.961047 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.961018 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/087ffc97-2de9-405c-b35e-6882975ff471-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:23.961213 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.961064 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/087ffc97-2de9-405c-b35e-6882975ff471-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:23.961213 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.961087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/087ffc97-2de9-405c-b35e-6882975ff471-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:23.961213 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.961104 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t8jn2\" (UniqueName: \"kubernetes.io/projected/087ffc97-2de9-405c-b35e-6882975ff471-kube-api-access-t8jn2\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:23.961463 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.961444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/087ffc97-2de9-405c-b35e-6882975ff471-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:23.961791 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.961770 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/087ffc97-2de9-405c-b35e-6882975ff471-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:23.963430 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.963412 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/087ffc97-2de9-405c-b35e-6882975ff471-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:23.969022 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:23.969000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8jn2\" (UniqueName: 
\"kubernetes.io/projected/087ffc97-2de9-405c-b35e-6882975ff471-kube-api-access-t8jn2\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:24.071944 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:24.071923 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:24.191468 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:24.191348 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs"] Apr 20 21:00:24.194269 ip-10-0-143-23 kubenswrapper[2576]: W0420 21:00:24.194241 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod087ffc97_2de9_405c_b35e_6882975ff471.slice/crio-fdb7b419c613ba9f3e002f499ae63c0d59f6ca53f09b166458a9e888cc0637a2 WatchSource:0}: Error finding container fdb7b419c613ba9f3e002f499ae63c0d59f6ca53f09b166458a9e888cc0637a2: Status 404 returned error can't find the container with id fdb7b419c613ba9f3e002f499ae63c0d59f6ca53f09b166458a9e888cc0637a2 Apr 20 21:00:24.196557 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:24.196539 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 21:00:25.038336 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:25.038300 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" event={"ID":"087ffc97-2de9-405c-b35e-6882975ff471","Type":"ContainerStarted","Data":"19db1d2b13569e77f39a98f697b365e7cef6e97083910ce5df960f73ca56d870"} Apr 20 21:00:25.038336 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:25.038336 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" event={"ID":"087ffc97-2de9-405c-b35e-6882975ff471","Type":"ContainerStarted","Data":"fdb7b419c613ba9f3e002f499ae63c0d59f6ca53f09b166458a9e888cc0637a2"} Apr 20 21:00:28.054677 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:28.054639 2576 generic.go:358] "Generic (PLEG): container finished" podID="087ffc97-2de9-405c-b35e-6882975ff471" containerID="19db1d2b13569e77f39a98f697b365e7cef6e97083910ce5df960f73ca56d870" exitCode=0 Apr 20 21:00:28.055098 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:28.054718 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" event={"ID":"087ffc97-2de9-405c-b35e-6882975ff471","Type":"ContainerDied","Data":"19db1d2b13569e77f39a98f697b365e7cef6e97083910ce5df960f73ca56d870"} Apr 20 21:00:29.060360 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:29.060323 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" event={"ID":"087ffc97-2de9-405c-b35e-6882975ff471","Type":"ContainerStarted","Data":"5892531b62e463cf3902f0dcb194c090d6fffb5d6ddb4b2cf8e7750f5163a2a0"} Apr 20 21:00:29.060360 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:29.060361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" 
event={"ID":"087ffc97-2de9-405c-b35e-6882975ff471","Type":"ContainerStarted","Data":"420dc7b50a8a03c1662f759f8fc059dad38695abc7a071719e30f3f0f6a1b62c"} Apr 20 21:00:29.060756 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:29.060695 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:29.060756 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:29.060729 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:29.082133 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:29.082079 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" podStartSLOduration=6.082066538 podStartE2EDuration="6.082066538s" podCreationTimestamp="2026-04-20 21:00:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:00:29.080086631 +0000 UTC m=+3303.792768639" watchObservedRunningTime="2026-04-20 21:00:29.082066538 +0000 UTC m=+3303.794748599" Apr 20 21:00:35.069544 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:35.069511 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:00:54.526041 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:54.525921 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 21:00:54.533309 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:00:54.532377 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 21:01:05.073957 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:05.073928 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:01:13.829789 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:13.829759 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs"] Apr 20 21:01:13.830255 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:13.830092 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" podUID="087ffc97-2de9-405c-b35e-6882975ff471" containerName="kserve-container" containerID="cri-o://420dc7b50a8a03c1662f759f8fc059dad38695abc7a071719e30f3f0f6a1b62c" gracePeriod=30 Apr 20 21:01:13.830255 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:13.830168 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" podUID="087ffc97-2de9-405c-b35e-6882975ff471" containerName="kube-rbac-proxy" containerID="cri-o://5892531b62e463cf3902f0dcb194c090d6fffb5d6ddb4b2cf8e7750f5163a2a0" gracePeriod=30 Apr 20 21:01:13.910913 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:13.910887 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9"] Apr 20 21:01:13.914621 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:13.914603 2576 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:13.917069 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:13.917053 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-predictor-serving-cert\"" Apr 20 21:01:13.917171 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:13.917095 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 20 21:01:13.922882 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:13.922860 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9"] Apr 20 21:01:13.944490 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:13.944463 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpcmb\" (UniqueName: \"kubernetes.io/projected/6b67b7a1-b2ad-4986-96b1-910e92def0ce-kube-api-access-kpcmb\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8shp9\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:13.944626 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:13.944522 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b67b7a1-b2ad-4986-96b1-910e92def0ce-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8shp9\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:13.944626 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:13.944555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b67b7a1-b2ad-4986-96b1-910e92def0ce-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8shp9\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:13.944626 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:13.944581 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b67b7a1-b2ad-4986-96b1-910e92def0ce-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8shp9\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:14.045005 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:14.044975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpcmb\" (UniqueName: \"kubernetes.io/projected/6b67b7a1-b2ad-4986-96b1-910e92def0ce-kube-api-access-kpcmb\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8shp9\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:14.045159 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:14.045025 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b67b7a1-b2ad-4986-96b1-910e92def0ce-proxy-tls\") pod 
\"xgboost-v2-mlserver-predictor-7799869d6f-8shp9\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:14.045235 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:01:14.045159 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-serving-cert: secret "xgboost-v2-mlserver-predictor-serving-cert" not found Apr 20 21:01:14.045235 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:14.045158 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b67b7a1-b2ad-4986-96b1-910e92def0ce-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8shp9\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:14.045235 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:01:14.045229 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b67b7a1-b2ad-4986-96b1-910e92def0ce-proxy-tls podName:6b67b7a1-b2ad-4986-96b1-910e92def0ce nodeName:}" failed. No retries permitted until 2026-04-20 21:01:14.545208789 +0000 UTC m=+3349.257890768 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6b67b7a1-b2ad-4986-96b1-910e92def0ce-proxy-tls") pod "xgboost-v2-mlserver-predictor-7799869d6f-8shp9" (UID: "6b67b7a1-b2ad-4986-96b1-910e92def0ce") : secret "xgboost-v2-mlserver-predictor-serving-cert" not found Apr 20 21:01:14.045400 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:14.045264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b67b7a1-b2ad-4986-96b1-910e92def0ce-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8shp9\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:14.045614 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:14.045595 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b67b7a1-b2ad-4986-96b1-910e92def0ce-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8shp9\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:14.045766 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:14.045748 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b67b7a1-b2ad-4986-96b1-910e92def0ce-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8shp9\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:14.056077 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:14.056050 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpcmb\" (UniqueName: \"kubernetes.io/projected/6b67b7a1-b2ad-4986-96b1-910e92def0ce-kube-api-access-kpcmb\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8shp9\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 
21:01:14.222061 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:14.221977 2576 generic.go:358] "Generic (PLEG): container finished" podID="087ffc97-2de9-405c-b35e-6882975ff471" containerID="5892531b62e463cf3902f0dcb194c090d6fffb5d6ddb4b2cf8e7750f5163a2a0" exitCode=2 Apr 20 21:01:14.222061 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:14.222050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" event={"ID":"087ffc97-2de9-405c-b35e-6882975ff471","Type":"ContainerDied","Data":"5892531b62e463cf3902f0dcb194c090d6fffb5d6ddb4b2cf8e7750f5163a2a0"} Apr 20 21:01:14.548737 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:14.548708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b67b7a1-b2ad-4986-96b1-910e92def0ce-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8shp9\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:14.551092 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:14.551067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b67b7a1-b2ad-4986-96b1-910e92def0ce-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-8shp9\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:14.826309 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:14.826224 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:14.946739 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:14.946711 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9"] Apr 20 21:01:14.948045 ip-10-0-143-23 kubenswrapper[2576]: W0420 21:01:14.948016 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b67b7a1_b2ad_4986_96b1_910e92def0ce.slice/crio-7fda5262cb6d767906429b874c1335f5d27e993ceeeed59d585584c9d5cc8fbf WatchSource:0}: Error finding container 7fda5262cb6d767906429b874c1335f5d27e993ceeeed59d585584c9d5cc8fbf: Status 404 returned error can't find the container with id 7fda5262cb6d767906429b874c1335f5d27e993ceeeed59d585584c9d5cc8fbf Apr 20 21:01:15.064749 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:15.064700 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" podUID="087ffc97-2de9-405c-b35e-6882975ff471" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.60:8643/healthz\": dial tcp 10.132.0.60:8643: connect: connection refused" Apr 20 21:01:15.226718 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:15.226632 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" event={"ID":"6b67b7a1-b2ad-4986-96b1-910e92def0ce","Type":"ContainerStarted","Data":"645f206487e09bd518d09b9cea5c49b8fb343a8a350d047d4552144ee4588627"} Apr 20 21:01:15.226718 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:15.226666 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" 
event={"ID":"6b67b7a1-b2ad-4986-96b1-910e92def0ce","Type":"ContainerStarted","Data":"7fda5262cb6d767906429b874c1335f5d27e993ceeeed59d585584c9d5cc8fbf"} Apr 20 21:01:19.240992 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:19.240955 2576 generic.go:358] "Generic (PLEG): container finished" podID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" containerID="645f206487e09bd518d09b9cea5c49b8fb343a8a350d047d4552144ee4588627" exitCode=0 Apr 20 21:01:19.241374 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:19.241032 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" event={"ID":"6b67b7a1-b2ad-4986-96b1-910e92def0ce","Type":"ContainerDied","Data":"645f206487e09bd518d09b9cea5c49b8fb343a8a350d047d4552144ee4588627"} Apr 20 21:01:19.776989 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:19.776967 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:01:19.893434 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:19.893370 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/087ffc97-2de9-405c-b35e-6882975ff471-kserve-provision-location\") pod \"087ffc97-2de9-405c-b35e-6882975ff471\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " Apr 20 21:01:19.893434 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:19.893412 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/087ffc97-2de9-405c-b35e-6882975ff471-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"087ffc97-2de9-405c-b35e-6882975ff471\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " Apr 20 21:01:19.893599 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:19.893445 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/087ffc97-2de9-405c-b35e-6882975ff471-proxy-tls\") pod \"087ffc97-2de9-405c-b35e-6882975ff471\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " Apr 20 21:01:19.893599 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:19.893486 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8jn2\" (UniqueName: \"kubernetes.io/projected/087ffc97-2de9-405c-b35e-6882975ff471-kube-api-access-t8jn2\") pod \"087ffc97-2de9-405c-b35e-6882975ff471\" (UID: \"087ffc97-2de9-405c-b35e-6882975ff471\") " Apr 20 21:01:19.893726 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:19.893700 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/087ffc97-2de9-405c-b35e-6882975ff471-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "087ffc97-2de9-405c-b35e-6882975ff471" (UID: "087ffc97-2de9-405c-b35e-6882975ff471"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:01:19.893837 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:19.893817 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/087ffc97-2de9-405c-b35e-6882975ff471-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "087ffc97-2de9-405c-b35e-6882975ff471" (UID: "087ffc97-2de9-405c-b35e-6882975ff471"). 
InnerVolumeSpecName "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:01:19.895510 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:19.895488 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/087ffc97-2de9-405c-b35e-6882975ff471-kube-api-access-t8jn2" (OuterVolumeSpecName: "kube-api-access-t8jn2") pod "087ffc97-2de9-405c-b35e-6882975ff471" (UID: "087ffc97-2de9-405c-b35e-6882975ff471"). InnerVolumeSpecName "kube-api-access-t8jn2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:01:19.895604 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:19.895583 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/087ffc97-2de9-405c-b35e-6882975ff471-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "087ffc97-2de9-405c-b35e-6882975ff471" (UID: "087ffc97-2de9-405c-b35e-6882975ff471"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:01:19.994843 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:19.994823 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t8jn2\" (UniqueName: \"kubernetes.io/projected/087ffc97-2de9-405c-b35e-6882975ff471-kube-api-access-t8jn2\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:01:19.994843 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:19.994845 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/087ffc97-2de9-405c-b35e-6882975ff471-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:01:19.994986 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:19.994855 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/087ffc97-2de9-405c-b35e-6882975ff471-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:01:19.994986 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:19.994865 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/087ffc97-2de9-405c-b35e-6882975ff471-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:01:20.246226 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.246158 2576 generic.go:358] "Generic (PLEG): container finished" podID="087ffc97-2de9-405c-b35e-6882975ff471" containerID="420dc7b50a8a03c1662f759f8fc059dad38695abc7a071719e30f3f0f6a1b62c" exitCode=0 Apr 20 21:01:20.246592 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.246256 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" Apr 20 21:01:20.246592 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.246255 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" event={"ID":"087ffc97-2de9-405c-b35e-6882975ff471","Type":"ContainerDied","Data":"420dc7b50a8a03c1662f759f8fc059dad38695abc7a071719e30f3f0f6a1b62c"} Apr 20 21:01:20.246592 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.246298 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs" event={"ID":"087ffc97-2de9-405c-b35e-6882975ff471","Type":"ContainerDied","Data":"fdb7b419c613ba9f3e002f499ae63c0d59f6ca53f09b166458a9e888cc0637a2"} Apr 20 21:01:20.246592 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.246318 2576 scope.go:117] "RemoveContainer" containerID="5892531b62e463cf3902f0dcb194c090d6fffb5d6ddb4b2cf8e7750f5163a2a0" Apr 20 21:01:20.248407 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.248361 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" event={"ID":"6b67b7a1-b2ad-4986-96b1-910e92def0ce","Type":"ContainerStarted","Data":"76abc8f54693af7f1c792c9f05845cfe9eba763e59839bee9bb2a5e11ac11314"} Apr 20 21:01:20.248549 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.248423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" event={"ID":"6b67b7a1-b2ad-4986-96b1-910e92def0ce","Type":"ContainerStarted","Data":"ad150112a7c2c8672d98d3bfebdbb2891ba158a947527bde060f4d6ff9db7e47"} Apr 20 21:01:20.248664 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.248629 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:20.248775 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.248694 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:20.254619 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.254602 2576 scope.go:117] "RemoveContainer" containerID="420dc7b50a8a03c1662f759f8fc059dad38695abc7a071719e30f3f0f6a1b62c" Apr 20 21:01:20.261666 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.261649 2576 scope.go:117] "RemoveContainer" containerID="19db1d2b13569e77f39a98f697b365e7cef6e97083910ce5df960f73ca56d870" Apr 20 21:01:20.269147 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.269124 2576 scope.go:117] "RemoveContainer" containerID="5892531b62e463cf3902f0dcb194c090d6fffb5d6ddb4b2cf8e7750f5163a2a0" Apr 20 21:01:20.269414 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:01:20.269391 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5892531b62e463cf3902f0dcb194c090d6fffb5d6ddb4b2cf8e7750f5163a2a0\": container with ID starting with 5892531b62e463cf3902f0dcb194c090d6fffb5d6ddb4b2cf8e7750f5163a2a0 not found: ID does not exist" containerID="5892531b62e463cf3902f0dcb194c090d6fffb5d6ddb4b2cf8e7750f5163a2a0" Apr 20 21:01:20.269503 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.269429 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5892531b62e463cf3902f0dcb194c090d6fffb5d6ddb4b2cf8e7750f5163a2a0"} err="failed to get 
container status \"5892531b62e463cf3902f0dcb194c090d6fffb5d6ddb4b2cf8e7750f5163a2a0\": rpc error: code = NotFound desc = could not find container \"5892531b62e463cf3902f0dcb194c090d6fffb5d6ddb4b2cf8e7750f5163a2a0\": container with ID starting with 5892531b62e463cf3902f0dcb194c090d6fffb5d6ddb4b2cf8e7750f5163a2a0 not found: ID does not exist" Apr 20 21:01:20.269503 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.269452 2576 scope.go:117] "RemoveContainer" containerID="420dc7b50a8a03c1662f759f8fc059dad38695abc7a071719e30f3f0f6a1b62c" Apr 20 21:01:20.269774 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:01:20.269751 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"420dc7b50a8a03c1662f759f8fc059dad38695abc7a071719e30f3f0f6a1b62c\": container with ID starting with 420dc7b50a8a03c1662f759f8fc059dad38695abc7a071719e30f3f0f6a1b62c not found: ID does not exist" containerID="420dc7b50a8a03c1662f759f8fc059dad38695abc7a071719e30f3f0f6a1b62c" Apr 20 21:01:20.269834 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.269783 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"420dc7b50a8a03c1662f759f8fc059dad38695abc7a071719e30f3f0f6a1b62c"} err="failed to get container status \"420dc7b50a8a03c1662f759f8fc059dad38695abc7a071719e30f3f0f6a1b62c\": rpc error: code = NotFound desc = could not find container \"420dc7b50a8a03c1662f759f8fc059dad38695abc7a071719e30f3f0f6a1b62c\": container with ID starting with 420dc7b50a8a03c1662f759f8fc059dad38695abc7a071719e30f3f0f6a1b62c not found: ID does not exist" Apr 20 21:01:20.269834 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.269799 2576 scope.go:117] "RemoveContainer" containerID="19db1d2b13569e77f39a98f697b365e7cef6e97083910ce5df960f73ca56d870" Apr 20 21:01:20.270066 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:01:20.270043 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19db1d2b13569e77f39a98f697b365e7cef6e97083910ce5df960f73ca56d870\": container with ID starting with 19db1d2b13569e77f39a98f697b365e7cef6e97083910ce5df960f73ca56d870 not found: ID does not exist" containerID="19db1d2b13569e77f39a98f697b365e7cef6e97083910ce5df960f73ca56d870" Apr 20 21:01:20.270135 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.270072 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19db1d2b13569e77f39a98f697b365e7cef6e97083910ce5df960f73ca56d870"} err="failed to get container status \"19db1d2b13569e77f39a98f697b365e7cef6e97083910ce5df960f73ca56d870\": rpc error: code = NotFound desc = could not find container \"19db1d2b13569e77f39a98f697b365e7cef6e97083910ce5df960f73ca56d870\": container with ID starting with 19db1d2b13569e77f39a98f697b365e7cef6e97083910ce5df960f73ca56d870 not found: ID does not exist" Apr 20 21:01:20.271158 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.271101 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" podStartSLOduration=7.271089192 podStartE2EDuration="7.271089192s" podCreationTimestamp="2026-04-20 21:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:01:20.268807184 +0000 UTC m=+3354.981489183" watchObservedRunningTime="2026-04-20 21:01:20.271089192 +0000 UTC m=+3354.983771209" Apr 20 
21:01:20.282298 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.282277 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs"] Apr 20 21:01:20.285312 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:20.285288 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-s4zrs"] Apr 20 21:01:21.944783 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:21.944746 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="087ffc97-2de9-405c-b35e-6882975ff471" path="/var/lib/kubelet/pods/087ffc97-2de9-405c-b35e-6882975ff471/volumes" Apr 20 21:01:26.257725 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:26.257697 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:01:56.261302 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:01:56.261271 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:02:04.008956 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:04.008924 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9"] Apr 20 21:02:04.009528 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:04.009350 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" podUID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" containerName="kserve-container" containerID="cri-o://ad150112a7c2c8672d98d3bfebdbb2891ba158a947527bde060f4d6ff9db7e47" gracePeriod=30 Apr 20 21:02:04.009528 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:04.009401 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" podUID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" containerName="kube-rbac-proxy" containerID="cri-o://76abc8f54693af7f1c792c9f05845cfe9eba763e59839bee9bb2a5e11ac11314" gracePeriod=30 Apr 20 21:02:04.400071 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:04.400032 2576 generic.go:358] "Generic (PLEG): container finished" podID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" containerID="76abc8f54693af7f1c792c9f05845cfe9eba763e59839bee9bb2a5e11ac11314" exitCode=2 Apr 20 21:02:04.400260 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:04.400104 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" event={"ID":"6b67b7a1-b2ad-4986-96b1-910e92def0ce","Type":"ContainerDied","Data":"76abc8f54693af7f1c792c9f05845cfe9eba763e59839bee9bb2a5e11ac11314"} Apr 20 21:02:06.253447 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:06.253360 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" podUID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.61:8643/healthz\": dial tcp 10.132.0.61:8643: connect: connection refused" Apr 20 21:02:06.258822 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:06.258796 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" podUID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.132.0.61:8080/v2/models/xgboost-v2-mlserver/ready\": dial tcp 10.132.0.61:8080: connect: connection refused" Apr 20 21:02:09.555742 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:09.555720 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:02:09.677601 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:09.677533 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b67b7a1-b2ad-4986-96b1-910e92def0ce-proxy-tls\") pod \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " Apr 20 21:02:09.677601 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:09.677573 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpcmb\" (UniqueName: \"kubernetes.io/projected/6b67b7a1-b2ad-4986-96b1-910e92def0ce-kube-api-access-kpcmb\") pod \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " Apr 20 21:02:09.677601 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:09.677592 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b67b7a1-b2ad-4986-96b1-910e92def0ce-kserve-provision-location\") pod \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " Apr 20 21:02:09.677809 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:09.677677 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b67b7a1-b2ad-4986-96b1-910e92def0ce-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\" (UID: \"6b67b7a1-b2ad-4986-96b1-910e92def0ce\") " Apr 20 21:02:09.678011 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:09.677969 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b67b7a1-b2ad-4986-96b1-910e92def0ce-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6b67b7a1-b2ad-4986-96b1-910e92def0ce" (UID: "6b67b7a1-b2ad-4986-96b1-910e92def0ce"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:02:09.678161 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:09.678044 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b67b7a1-b2ad-4986-96b1-910e92def0ce-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "6b67b7a1-b2ad-4986-96b1-910e92def0ce" (UID: "6b67b7a1-b2ad-4986-96b1-910e92def0ce"). InnerVolumeSpecName "xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:02:09.679556 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:09.679530 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b67b7a1-b2ad-4986-96b1-910e92def0ce-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6b67b7a1-b2ad-4986-96b1-910e92def0ce" (UID: "6b67b7a1-b2ad-4986-96b1-910e92def0ce"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:02:09.679657 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:09.679644 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b67b7a1-b2ad-4986-96b1-910e92def0ce-kube-api-access-kpcmb" (OuterVolumeSpecName: "kube-api-access-kpcmb") pod "6b67b7a1-b2ad-4986-96b1-910e92def0ce" (UID: "6b67b7a1-b2ad-4986-96b1-910e92def0ce"). InnerVolumeSpecName "kube-api-access-kpcmb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:02:09.778668 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:09.778638 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6b67b7a1-b2ad-4986-96b1-910e92def0ce-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:02:09.778668 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:09.778664 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kpcmb\" (UniqueName: \"kubernetes.io/projected/6b67b7a1-b2ad-4986-96b1-910e92def0ce-kube-api-access-kpcmb\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:02:09.778827 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:09.778675 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6b67b7a1-b2ad-4986-96b1-910e92def0ce-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:02:09.778827 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:09.778685 2576 reconciler_common.go:299] "Volume detached for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6b67b7a1-b2ad-4986-96b1-910e92def0ce-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:02:10.421434 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:10.421401 2576 generic.go:358] "Generic (PLEG): container finished" podID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" containerID="ad150112a7c2c8672d98d3bfebdbb2891ba158a947527bde060f4d6ff9db7e47" exitCode=0 Apr 20 21:02:10.421610 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:10.421471 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" Apr 20 21:02:10.421610 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:10.421472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" event={"ID":"6b67b7a1-b2ad-4986-96b1-910e92def0ce","Type":"ContainerDied","Data":"ad150112a7c2c8672d98d3bfebdbb2891ba158a947527bde060f4d6ff9db7e47"} Apr 20 21:02:10.421610 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:10.421514 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9" event={"ID":"6b67b7a1-b2ad-4986-96b1-910e92def0ce","Type":"ContainerDied","Data":"7fda5262cb6d767906429b874c1335f5d27e993ceeeed59d585584c9d5cc8fbf"} Apr 20 21:02:10.421610 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:10.421534 2576 scope.go:117] "RemoveContainer" containerID="76abc8f54693af7f1c792c9f05845cfe9eba763e59839bee9bb2a5e11ac11314" Apr 20 21:02:10.429895 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:10.429692 2576 scope.go:117] "RemoveContainer" containerID="ad150112a7c2c8672d98d3bfebdbb2891ba158a947527bde060f4d6ff9db7e47" Apr 20 21:02:10.436831 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:10.436813 2576 scope.go:117] "RemoveContainer" containerID="645f206487e09bd518d09b9cea5c49b8fb343a8a350d047d4552144ee4588627" Apr 20 21:02:10.441545 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:10.441519 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9"] Apr 20 21:02:10.444254 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:10.444237 2576 scope.go:117] "RemoveContainer" containerID="76abc8f54693af7f1c792c9f05845cfe9eba763e59839bee9bb2a5e11ac11314" Apr 20 21:02:10.444621 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:02:10.444595 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76abc8f54693af7f1c792c9f05845cfe9eba763e59839bee9bb2a5e11ac11314\": container with ID starting with 76abc8f54693af7f1c792c9f05845cfe9eba763e59839bee9bb2a5e11ac11314 not found: ID does not exist" containerID="76abc8f54693af7f1c792c9f05845cfe9eba763e59839bee9bb2a5e11ac11314" Apr 20 21:02:10.444745 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:10.444630 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76abc8f54693af7f1c792c9f05845cfe9eba763e59839bee9bb2a5e11ac11314"} err="failed to get container status \"76abc8f54693af7f1c792c9f05845cfe9eba763e59839bee9bb2a5e11ac11314\": rpc error: code = NotFound desc = could not find container \"76abc8f54693af7f1c792c9f05845cfe9eba763e59839bee9bb2a5e11ac11314\": container with ID starting with 76abc8f54693af7f1c792c9f05845cfe9eba763e59839bee9bb2a5e11ac11314 not found: ID does not exist" Apr 20 21:02:10.444745 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:10.444655 2576 scope.go:117] "RemoveContainer" containerID="ad150112a7c2c8672d98d3bfebdbb2891ba158a947527bde060f4d6ff9db7e47" Apr 20 21:02:10.445077 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:02:10.445054 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad150112a7c2c8672d98d3bfebdbb2891ba158a947527bde060f4d6ff9db7e47\": container with ID starting with ad150112a7c2c8672d98d3bfebdbb2891ba158a947527bde060f4d6ff9db7e47 not found: ID does not exist" 
containerID="ad150112a7c2c8672d98d3bfebdbb2891ba158a947527bde060f4d6ff9db7e47" Apr 20 21:02:10.445206 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:10.445082 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad150112a7c2c8672d98d3bfebdbb2891ba158a947527bde060f4d6ff9db7e47"} err="failed to get container status \"ad150112a7c2c8672d98d3bfebdbb2891ba158a947527bde060f4d6ff9db7e47\": rpc error: code = NotFound desc = could not find container \"ad150112a7c2c8672d98d3bfebdbb2891ba158a947527bde060f4d6ff9db7e47\": container with ID starting with ad150112a7c2c8672d98d3bfebdbb2891ba158a947527bde060f4d6ff9db7e47 not found: ID does not exist" Apr 20 21:02:10.445206 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:10.445099 2576 scope.go:117] "RemoveContainer" containerID="645f206487e09bd518d09b9cea5c49b8fb343a8a350d047d4552144ee4588627" Apr 20 21:02:10.445405 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:02:10.445381 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"645f206487e09bd518d09b9cea5c49b8fb343a8a350d047d4552144ee4588627\": container with ID starting with 645f206487e09bd518d09b9cea5c49b8fb343a8a350d047d4552144ee4588627 not found: ID does not exist" containerID="645f206487e09bd518d09b9cea5c49b8fb343a8a350d047d4552144ee4588627" Apr 20 21:02:10.445458 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:10.445409 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"645f206487e09bd518d09b9cea5c49b8fb343a8a350d047d4552144ee4588627"} err="failed to get container status \"645f206487e09bd518d09b9cea5c49b8fb343a8a350d047d4552144ee4588627\": rpc error: code = NotFound desc = could not find container \"645f206487e09bd518d09b9cea5c49b8fb343a8a350d047d4552144ee4588627\": container with ID starting with 645f206487e09bd518d09b9cea5c49b8fb343a8a350d047d4552144ee4588627 not found: ID does not exist" Apr 20 21:02:10.447217 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:10.447199 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-8shp9"] Apr 20 21:02:11.944056 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:02:11.944023 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" path="/var/lib/kubelet/pods/6b67b7a1-b2ad-4986-96b1-910e92def0ce/volumes" Apr 20 21:03:24.258932 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.258898 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l"] Apr 20 21:03:24.259545 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259388 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" containerName="storage-initializer" Apr 20 21:03:24.259545 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259408 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" containerName="storage-initializer" Apr 20 21:03:24.259545 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259422 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="087ffc97-2de9-405c-b35e-6882975ff471" containerName="kserve-container" Apr 20 21:03:24.259545 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259432 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="087ffc97-2de9-405c-b35e-6882975ff471" 
containerName="kserve-container" Apr 20 21:03:24.259545 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259442 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" containerName="kube-rbac-proxy" Apr 20 21:03:24.259545 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259452 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" containerName="kube-rbac-proxy" Apr 20 21:03:24.259545 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259475 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="087ffc97-2de9-405c-b35e-6882975ff471" containerName="storage-initializer" Apr 20 21:03:24.259545 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259483 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="087ffc97-2de9-405c-b35e-6882975ff471" containerName="storage-initializer" Apr 20 21:03:24.259545 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259494 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="087ffc97-2de9-405c-b35e-6882975ff471" containerName="kube-rbac-proxy" Apr 20 21:03:24.259545 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259502 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="087ffc97-2de9-405c-b35e-6882975ff471" containerName="kube-rbac-proxy" Apr 20 21:03:24.259545 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259511 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" containerName="kserve-container" Apr 20 21:03:24.259545 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259520 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" containerName="kserve-container" Apr 20 21:03:24.260176 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259612 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" containerName="kserve-container" Apr 20 21:03:24.260176 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259626 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b67b7a1-b2ad-4986-96b1-910e92def0ce" containerName="kube-rbac-proxy" Apr 20 21:03:24.260176 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259639 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="087ffc97-2de9-405c-b35e-6882975ff471" containerName="kube-rbac-proxy" Apr 20 21:03:24.260176 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.259651 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="087ffc97-2de9-405c-b35e-6882975ff471" containerName="kserve-container" Apr 20 21:03:24.263187 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.263164 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:24.265676 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.265653 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-predictor-serving-cert\"" Apr 20 21:03:24.265799 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.265749 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 20 21:03:24.265930 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.265905 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 20 21:03:24.266054 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.265955 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 20 21:03:24.266054 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.265982 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5zv6\"" Apr 20 21:03:24.274523 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.274496 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l"] Apr 20 21:03:24.396882 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.396848 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rl8x\" (UniqueName: \"kubernetes.io/projected/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-kube-api-access-8rl8x\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:24.397031 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.396895 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:24.397031 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.396975 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:24.397031 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.397026 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:24.497700 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.497669 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:24.497842 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.497718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rl8x\" (UniqueName: \"kubernetes.io/projected/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-kube-api-access-8rl8x\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:24.497842 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.497759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:24.497842 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.497803 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:24.498157 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.498136 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:24.498345 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.498326 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:24.500194 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.500174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:24.505620 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.505594 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rl8x\" (UniqueName: \"kubernetes.io/projected/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-kube-api-access-8rl8x\") pod 
\"isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:24.574542 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.574521 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:24.697925 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:24.697901 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l"] Apr 20 21:03:24.699852 ip-10-0-143-23 kubenswrapper[2576]: W0420 21:03:24.699822 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40cfc1c0_2ec1_4356_b34b_3e26d3f55b0d.slice/crio-dd25f1220114a75f7d1f4207080a8ff2dd397a3d9ac9585dd21bd58061e2f9f2 WatchSource:0}: Error finding container dd25f1220114a75f7d1f4207080a8ff2dd397a3d9ac9585dd21bd58061e2f9f2: Status 404 returned error can't find the container with id dd25f1220114a75f7d1f4207080a8ff2dd397a3d9ac9585dd21bd58061e2f9f2 Apr 20 21:03:25.668200 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:25.668160 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" event={"ID":"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d","Type":"ContainerStarted","Data":"f005fc14487c3bb3be17f715b5bb0d6a710dedec7393533550c9e4b493071d7b"} Apr 20 21:03:25.668200 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:25.668202 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" event={"ID":"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d","Type":"ContainerStarted","Data":"dd25f1220114a75f7d1f4207080a8ff2dd397a3d9ac9585dd21bd58061e2f9f2"} Apr 20 21:03:28.679796 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:28.679762 2576 generic.go:358] "Generic (PLEG): container finished" podID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerID="f005fc14487c3bb3be17f715b5bb0d6a710dedec7393533550c9e4b493071d7b" exitCode=0 Apr 20 21:03:28.680179 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:28.679834 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" event={"ID":"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d","Type":"ContainerDied","Data":"f005fc14487c3bb3be17f715b5bb0d6a710dedec7393533550c9e4b493071d7b"} Apr 20 21:03:29.690152 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:29.690101 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" event={"ID":"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d","Type":"ContainerStarted","Data":"c1249e1595815b2413aed5a8b527d614268481dff032eb403b97885fd58ff8f6"} Apr 20 21:03:29.690152 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:29.690153 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" event={"ID":"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d","Type":"ContainerStarted","Data":"037eb80a24125bb38663ad53257abd58925d6c513da46b1c224f210c0812934e"} Apr 20 21:03:29.690622 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:29.690490 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:29.690622 ip-10-0-143-23 
kubenswrapper[2576]: I0420 21:03:29.690520 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:03:29.708897 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:29.708858 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" podStartSLOduration=5.708847788 podStartE2EDuration="5.708847788s" podCreationTimestamp="2026-04-20 21:03:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:03:29.707182842 +0000 UTC m=+3484.419864839" watchObservedRunningTime="2026-04-20 21:03:29.708847788 +0000 UTC m=+3484.421529785" Apr 20 21:03:35.698675 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:03:35.698648 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:04:05.739031 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:05.738978 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" podUID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 20 21:04:15.701065 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:15.701030 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:04:24.354993 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:24.354959 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l"] Apr 20 21:04:24.355380 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:24.355341 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" podUID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerName="kserve-container" containerID="cri-o://037eb80a24125bb38663ad53257abd58925d6c513da46b1c224f210c0812934e" gracePeriod=30 Apr 20 21:04:24.355380 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:24.355358 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" podUID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerName="kube-rbac-proxy" containerID="cri-o://c1249e1595815b2413aed5a8b527d614268481dff032eb403b97885fd58ff8f6" gracePeriod=30 Apr 20 21:04:24.880670 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:24.880635 2576 generic.go:358] "Generic (PLEG): container finished" podID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerID="c1249e1595815b2413aed5a8b527d614268481dff032eb403b97885fd58ff8f6" exitCode=2 Apr 20 21:04:24.880858 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:24.880707 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" event={"ID":"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d","Type":"ContainerDied","Data":"c1249e1595815b2413aed5a8b527d614268481dff032eb403b97885fd58ff8f6"} Apr 20 21:04:25.693935 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:25.693894 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" 
podUID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.62:8643/healthz\": dial tcp 10.132.0.62:8643: connect: connection refused" Apr 20 21:04:26.740280 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:26.740228 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" podUID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.62:8080/v2/models/isvc-xgboost-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 20 21:04:30.693883 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:30.693839 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" podUID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.62:8643/healthz\": dial tcp 10.132.0.62:8643: connect: connection refused" Apr 20 21:04:34.799915 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.799893 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:04:34.912973 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.912904 2576 generic.go:358] "Generic (PLEG): container finished" podID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerID="037eb80a24125bb38663ad53257abd58925d6c513da46b1c224f210c0812934e" exitCode=0 Apr 20 21:04:34.913085 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.912978 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" Apr 20 21:04:34.913085 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.912990 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" event={"ID":"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d","Type":"ContainerDied","Data":"037eb80a24125bb38663ad53257abd58925d6c513da46b1c224f210c0812934e"} Apr 20 21:04:34.913085 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.913029 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l" event={"ID":"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d","Type":"ContainerDied","Data":"dd25f1220114a75f7d1f4207080a8ff2dd397a3d9ac9585dd21bd58061e2f9f2"} Apr 20 21:04:34.913085 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.913045 2576 scope.go:117] "RemoveContainer" containerID="c1249e1595815b2413aed5a8b527d614268481dff032eb403b97885fd58ff8f6" Apr 20 21:04:34.920704 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.920688 2576 scope.go:117] "RemoveContainer" containerID="037eb80a24125bb38663ad53257abd58925d6c513da46b1c224f210c0812934e" Apr 20 21:04:34.927785 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.927770 2576 scope.go:117] "RemoveContainer" containerID="f005fc14487c3bb3be17f715b5bb0d6a710dedec7393533550c9e4b493071d7b" Apr 20 21:04:34.934606 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.934588 2576 scope.go:117] "RemoveContainer" containerID="c1249e1595815b2413aed5a8b527d614268481dff032eb403b97885fd58ff8f6" Apr 20 21:04:34.934812 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:04:34.934798 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"c1249e1595815b2413aed5a8b527d614268481dff032eb403b97885fd58ff8f6\": container with ID starting with c1249e1595815b2413aed5a8b527d614268481dff032eb403b97885fd58ff8f6 not found: ID does not exist" containerID="c1249e1595815b2413aed5a8b527d614268481dff032eb403b97885fd58ff8f6" Apr 20 21:04:34.934862 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.934820 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1249e1595815b2413aed5a8b527d614268481dff032eb403b97885fd58ff8f6"} err="failed to get container status \"c1249e1595815b2413aed5a8b527d614268481dff032eb403b97885fd58ff8f6\": rpc error: code = NotFound desc = could not find container \"c1249e1595815b2413aed5a8b527d614268481dff032eb403b97885fd58ff8f6\": container with ID starting with c1249e1595815b2413aed5a8b527d614268481dff032eb403b97885fd58ff8f6 not found: ID does not exist" Apr 20 21:04:34.934862 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.934835 2576 scope.go:117] "RemoveContainer" containerID="037eb80a24125bb38663ad53257abd58925d6c513da46b1c224f210c0812934e" Apr 20 21:04:34.935036 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:04:34.935023 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"037eb80a24125bb38663ad53257abd58925d6c513da46b1c224f210c0812934e\": container with ID starting with 037eb80a24125bb38663ad53257abd58925d6c513da46b1c224f210c0812934e not found: ID does not exist" containerID="037eb80a24125bb38663ad53257abd58925d6c513da46b1c224f210c0812934e" Apr 20 21:04:34.935080 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.935039 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"037eb80a24125bb38663ad53257abd58925d6c513da46b1c224f210c0812934e"} err="failed to get container status \"037eb80a24125bb38663ad53257abd58925d6c513da46b1c224f210c0812934e\": rpc error: code = NotFound desc = could not find container \"037eb80a24125bb38663ad53257abd58925d6c513da46b1c224f210c0812934e\": container with ID starting with 037eb80a24125bb38663ad53257abd58925d6c513da46b1c224f210c0812934e not found: ID does not exist" Apr 20 21:04:34.935080 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.935050 2576 scope.go:117] "RemoveContainer" containerID="f005fc14487c3bb3be17f715b5bb0d6a710dedec7393533550c9e4b493071d7b" Apr 20 21:04:34.935264 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:04:34.935246 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f005fc14487c3bb3be17f715b5bb0d6a710dedec7393533550c9e4b493071d7b\": container with ID starting with f005fc14487c3bb3be17f715b5bb0d6a710dedec7393533550c9e4b493071d7b not found: ID does not exist" containerID="f005fc14487c3bb3be17f715b5bb0d6a710dedec7393533550c9e4b493071d7b" Apr 20 21:04:34.935306 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.935269 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f005fc14487c3bb3be17f715b5bb0d6a710dedec7393533550c9e4b493071d7b"} err="failed to get container status \"f005fc14487c3bb3be17f715b5bb0d6a710dedec7393533550c9e4b493071d7b\": rpc error: code = NotFound desc = could not find container \"f005fc14487c3bb3be17f715b5bb0d6a710dedec7393533550c9e4b493071d7b\": container with ID starting with f005fc14487c3bb3be17f715b5bb0d6a710dedec7393533550c9e4b493071d7b not found: ID does not exist" Apr 20 21:04:34.951089 ip-10-0-143-23 kubenswrapper[2576]: I0420 
21:04:34.951069 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rl8x\" (UniqueName: \"kubernetes.io/projected/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-kube-api-access-8rl8x\") pod \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " Apr 20 21:04:34.951161 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.951101 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " Apr 20 21:04:34.951161 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.951153 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-kserve-provision-location\") pod \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " Apr 20 21:04:34.951246 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.951207 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-proxy-tls\") pod \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\" (UID: \"40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d\") " Apr 20 21:04:34.951485 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.951466 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config") pod "40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" (UID: "40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d"). InnerVolumeSpecName "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:04:34.951535 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.951487 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" (UID: "40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:04:34.952910 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.952892 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-kube-api-access-8rl8x" (OuterVolumeSpecName: "kube-api-access-8rl8x") pod "40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" (UID: "40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d"). InnerVolumeSpecName "kube-api-access-8rl8x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:04:34.953172 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:34.953155 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" (UID: "40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:04:35.051898 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:35.051865 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:04:35.051898 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:35.051896 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:04:35.052029 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:35.051909 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:04:35.052029 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:35.051920 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8rl8x\" (UniqueName: \"kubernetes.io/projected/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d-kube-api-access-8rl8x\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:04:35.237644 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:35.237617 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l"] Apr 20 21:04:35.241499 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:35.241478 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-qq75l"] Apr 20 21:04:35.943841 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:04:35.943804 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" path="/var/lib/kubelet/pods/40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d/volumes" Apr 20 21:05:44.622539 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.622494 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9"] Apr 20 21:05:44.622979 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.622965 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerName="storage-initializer" Apr 20 21:05:44.623029 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.622982 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerName="storage-initializer" Apr 20 21:05:44.623029 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.622995 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerName="kserve-container" Apr 20 21:05:44.623029 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.623004 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerName="kserve-container" Apr 20 21:05:44.623175 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.623029 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerName="kube-rbac-proxy" Apr 20 21:05:44.623175 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.623039 2576 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerName="kube-rbac-proxy" Apr 20 21:05:44.623175 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.623140 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerName="kserve-container" Apr 20 21:05:44.623175 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.623152 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="40cfc1c0-2ec1-4356-b34b-3e26d3f55b0d" containerName="kube-rbac-proxy" Apr 20 21:05:44.626618 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.626594 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:44.629069 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.629039 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-predictor-serving-cert\"" Apr 20 21:05:44.629190 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.629067 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 20 21:05:44.629190 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.629092 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-c5zv6\"" Apr 20 21:05:44.630091 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.630071 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 20 21:05:44.630491 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.630162 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 20 21:05:44.630491 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.630184 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\"" Apr 20 21:05:44.632983 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.632961 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9"] Apr 20 21:05:44.654493 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.654471 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-nr6v9\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:44.654590 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.654503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzqz5\" (UniqueName: \"kubernetes.io/projected/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-kube-api-access-hzqz5\") pod \"isvc-sklearn-s3-predictor-88457d696-nr6v9\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:44.654590 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.654543 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-s3-predictor-88457d696-nr6v9\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:44.654590 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.654570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-88457d696-nr6v9\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:44.754857 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.754830 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-nr6v9\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:44.755041 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.754862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzqz5\" (UniqueName: \"kubernetes.io/projected/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-kube-api-access-hzqz5\") pod \"isvc-sklearn-s3-predictor-88457d696-nr6v9\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:44.755041 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.754895 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-88457d696-nr6v9\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:44.755041 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.754925 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-88457d696-nr6v9\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:44.755041 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:05:44.754990 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-predictor-serving-cert: secret "isvc-sklearn-s3-predictor-serving-cert" not found Apr 20 21:05:44.755313 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:05:44.755067 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-proxy-tls podName:6a8049a3-c453-40a7-8b86-ad14a7ec8b65 nodeName:}" failed. No retries permitted until 2026-04-20 21:05:45.255045472 +0000 UTC m=+3619.967727460 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-proxy-tls") pod "isvc-sklearn-s3-predictor-88457d696-nr6v9" (UID: "6a8049a3-c453-40a7-8b86-ad14a7ec8b65") : secret "isvc-sklearn-s3-predictor-serving-cert" not found Apr 20 21:05:44.755386 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.755359 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-88457d696-nr6v9\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:44.755655 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.755633 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-88457d696-nr6v9\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:44.763758 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:44.763734 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzqz5\" (UniqueName: \"kubernetes.io/projected/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-kube-api-access-hzqz5\") pod \"isvc-sklearn-s3-predictor-88457d696-nr6v9\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:45.258926 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:45.258888 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-nr6v9\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:45.261277 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:45.261259 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-nr6v9\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:45.538817 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:45.538785 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:45.661232 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:45.661201 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9"] Apr 20 21:05:45.664294 ip-10-0-143-23 kubenswrapper[2576]: W0420 21:05:45.664265 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a8049a3_c453_40a7_8b86_ad14a7ec8b65.slice/crio-54064259821a6b68225db6f317218124a97a816e5b3da3a719b2b5473cd4cbbc WatchSource:0}: Error finding container 54064259821a6b68225db6f317218124a97a816e5b3da3a719b2b5473cd4cbbc: Status 404 returned error can't find the container with id 54064259821a6b68225db6f317218124a97a816e5b3da3a719b2b5473cd4cbbc Apr 20 21:05:45.666146 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:45.666129 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 21:05:46.138461 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:46.138419 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" event={"ID":"6a8049a3-c453-40a7-8b86-ad14a7ec8b65","Type":"ContainerStarted","Data":"69bda21086b2ade55f68a7123c01fceb030f89014ae0cc940ab438c7f5965232"} Apr 20 21:05:46.138623 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:46.138472 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" event={"ID":"6a8049a3-c453-40a7-8b86-ad14a7ec8b65","Type":"ContainerStarted","Data":"54064259821a6b68225db6f317218124a97a816e5b3da3a719b2b5473cd4cbbc"} Apr 20 21:05:47.142763 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:47.142720 2576 generic.go:358] "Generic (PLEG): container finished" podID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerID="69bda21086b2ade55f68a7123c01fceb030f89014ae0cc940ab438c7f5965232" exitCode=0 Apr 20 21:05:47.143147 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:47.142804 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" event={"ID":"6a8049a3-c453-40a7-8b86-ad14a7ec8b65","Type":"ContainerDied","Data":"69bda21086b2ade55f68a7123c01fceb030f89014ae0cc940ab438c7f5965232"} Apr 20 21:05:48.147870 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:48.147830 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" event={"ID":"6a8049a3-c453-40a7-8b86-ad14a7ec8b65","Type":"ContainerStarted","Data":"16dd00b8e92a48afb1de700c65de2150aad338b99b9bdd7dcc1d38f9d0afa662"} Apr 20 21:05:48.147870 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:48.147866 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" event={"ID":"6a8049a3-c453-40a7-8b86-ad14a7ec8b65","Type":"ContainerStarted","Data":"b00eda351987dbfdc29c6fe2bcce6d0a0621da7200870b1cf80a87aed5732450"} Apr 20 21:05:48.148277 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:48.148041 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:48.166384 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:48.166342 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" 
podStartSLOduration=4.166329903 podStartE2EDuration="4.166329903s" podCreationTimestamp="2026-04-20 21:05:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:05:48.164492365 +0000 UTC m=+3622.877174367" watchObservedRunningTime="2026-04-20 21:05:48.166329903 +0000 UTC m=+3622.879011900" Apr 20 21:05:49.151758 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:49.151728 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:49.152732 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:49.152698 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" Apr 20 21:05:50.155266 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:50.155225 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" Apr 20 21:05:54.550797 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:54.550769 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 21:05:54.557154 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:54.557134 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 21:05:55.159209 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:55.159134 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:05:55.159648 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:05:55.159620 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" Apr 20 21:06:05.160335 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:06:05.160294 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" Apr 20 21:06:15.160027 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:06:15.159991 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" Apr 20 21:06:25.159902 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:06:25.159862 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" 
Apr 20 21:06:35.160291 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:06:35.160254 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" Apr 20 21:06:45.160051 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:06:45.159969 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" Apr 20 21:06:55.160454 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:06:55.160420 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:07:04.721688 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.721657 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9"] Apr 20 21:07:04.722125 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.722019 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kserve-container" containerID="cri-o://b00eda351987dbfdc29c6fe2bcce6d0a0621da7200870b1cf80a87aed5732450" gracePeriod=30 Apr 20 21:07:04.722125 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.722062 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kube-rbac-proxy" containerID="cri-o://16dd00b8e92a48afb1de700c65de2150aad338b99b9bdd7dcc1d38f9d0afa662" gracePeriod=30 Apr 20 21:07:04.833560 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.833531 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w"] Apr 20 21:07:04.837330 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.837308 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:04.839965 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.839942 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 20 21:07:04.839965 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.839955 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\"" Apr 20 21:07:04.840209 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.839995 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-predictor-serving-cert\"" Apr 20 21:07:04.846893 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.846871 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w"] Apr 20 21:07:04.878648 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.878622 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/35b95e08-2815-4f36-a556-c809b43eee74-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:04.878772 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.878658 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/35b95e08-2815-4f36-a556-c809b43eee74-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:04.878772 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.878685 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/35b95e08-2815-4f36-a556-c809b43eee74-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:04.878772 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.878758 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35b95e08-2815-4f36-a556-c809b43eee74-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:04.878965 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.878811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z248t\" (UniqueName: \"kubernetes.io/projected/35b95e08-2815-4f36-a556-c809b43eee74-kube-api-access-z248t\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:04.979363 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.979286 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/35b95e08-2815-4f36-a556-c809b43eee74-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:04.979363 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.979342 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/35b95e08-2815-4f36-a556-c809b43eee74-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:04.979570 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.979384 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/35b95e08-2815-4f36-a556-c809b43eee74-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:04.979570 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.979426 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35b95e08-2815-4f36-a556-c809b43eee74-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:04.979570 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.979512 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z248t\" (UniqueName: \"kubernetes.io/projected/35b95e08-2815-4f36-a556-c809b43eee74-kube-api-access-z248t\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:04.980022 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.980000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/35b95e08-2815-4f36-a556-c809b43eee74-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:04.980106 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.980022 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35b95e08-2815-4f36-a556-c809b43eee74-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:04.980106 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.980040 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/35b95e08-2815-4f36-a556-c809b43eee74-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:04.981891 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.981868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/35b95e08-2815-4f36-a556-c809b43eee74-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:04.988534 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:04.988511 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z248t\" (UniqueName: \"kubernetes.io/projected/35b95e08-2815-4f36-a556-c809b43eee74-kube-api-access-z248t\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:05.149381 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:05.149353 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:05.156285 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:05.156255 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.63:8643/healthz\": dial tcp 10.132.0.63:8643: connect: connection refused" Apr 20 21:07:05.159565 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:05.159536 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" Apr 20 21:07:05.271464 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:05.271386 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w"] Apr 20 21:07:05.275031 ip-10-0-143-23 kubenswrapper[2576]: W0420 21:07:05.275002 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35b95e08_2815_4f36_a556_c809b43eee74.slice/crio-dd80a975ed16033e2e19a0d0f9c59de9c5aac40cbdc12e9d70ef0dca87e1d8ab WatchSource:0}: Error finding container dd80a975ed16033e2e19a0d0f9c59de9c5aac40cbdc12e9d70ef0dca87e1d8ab: Status 404 returned error can't find the container with id dd80a975ed16033e2e19a0d0f9c59de9c5aac40cbdc12e9d70ef0dca87e1d8ab Apr 20 21:07:05.402255 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:05.402220 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" 
event={"ID":"35b95e08-2815-4f36-a556-c809b43eee74","Type":"ContainerStarted","Data":"1c8e118d3b8b96795646c064511c74734568ac68fc037c667b473ccc2b5ec795"} Apr 20 21:07:05.402402 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:05.402259 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" event={"ID":"35b95e08-2815-4f36-a556-c809b43eee74","Type":"ContainerStarted","Data":"dd80a975ed16033e2e19a0d0f9c59de9c5aac40cbdc12e9d70ef0dca87e1d8ab"} Apr 20 21:07:05.404044 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:05.404016 2576 generic.go:358] "Generic (PLEG): container finished" podID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerID="16dd00b8e92a48afb1de700c65de2150aad338b99b9bdd7dcc1d38f9d0afa662" exitCode=2 Apr 20 21:07:05.404171 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:05.404084 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" event={"ID":"6a8049a3-c453-40a7-8b86-ad14a7ec8b65","Type":"ContainerDied","Data":"16dd00b8e92a48afb1de700c65de2150aad338b99b9bdd7dcc1d38f9d0afa662"} Apr 20 21:07:06.408936 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:06.408896 2576 generic.go:358] "Generic (PLEG): container finished" podID="35b95e08-2815-4f36-a556-c809b43eee74" containerID="1c8e118d3b8b96795646c064511c74734568ac68fc037c667b473ccc2b5ec795" exitCode=0 Apr 20 21:07:06.409418 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:06.408980 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" event={"ID":"35b95e08-2815-4f36-a556-c809b43eee74","Type":"ContainerDied","Data":"1c8e118d3b8b96795646c064511c74734568ac68fc037c667b473ccc2b5ec795"} Apr 20 21:07:07.414607 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:07.414567 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" event={"ID":"35b95e08-2815-4f36-a556-c809b43eee74","Type":"ContainerStarted","Data":"38c55e32280b578ead892e122ffe209c395a42033969083e4f31491133fc3180"} Apr 20 21:07:07.414607 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:07.414605 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" event={"ID":"35b95e08-2815-4f36-a556-c809b43eee74","Type":"ContainerStarted","Data":"04f06d5e7788f3a098647943fa2012f1508313df9b9247eb031f663fe96030a4"} Apr 20 21:07:07.415156 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:07.414685 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:07.436271 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:07.436216 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" podStartSLOduration=3.436204371 podStartE2EDuration="3.436204371s" podCreationTimestamp="2026-04-20 21:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:07:07.43410022 +0000 UTC m=+3702.146782213" watchObservedRunningTime="2026-04-20 21:07:07.436204371 +0000 UTC m=+3702.148886428" Apr 20 21:07:08.418566 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:08.418536 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:08.419893 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:08.419865 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 20 21:07:08.867569 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:08.867543 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:07:08.912270 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:08.912175 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-kserve-provision-location\") pod \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " Apr 20 21:07:08.912404 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:08.912318 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzqz5\" (UniqueName: \"kubernetes.io/projected/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-kube-api-access-hzqz5\") pod \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " Apr 20 21:07:08.912404 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:08.912356 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " Apr 20 21:07:08.912528 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:08.912423 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6a8049a3-c453-40a7-8b86-ad14a7ec8b65" (UID: "6a8049a3-c453-40a7-8b86-ad14a7ec8b65"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:07:08.912528 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:08.912426 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-proxy-tls\") pod \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\" (UID: \"6a8049a3-c453-40a7-8b86-ad14a7ec8b65\") " Apr 20 21:07:08.912783 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:08.912754 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-isvc-sklearn-s3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-kube-rbac-proxy-sar-config") pod "6a8049a3-c453-40a7-8b86-ad14a7ec8b65" (UID: "6a8049a3-c453-40a7-8b86-ad14a7ec8b65"). InnerVolumeSpecName "isvc-sklearn-s3-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:07:08.912895 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:08.912827 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:07:08.914604 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:08.914576 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6a8049a3-c453-40a7-8b86-ad14a7ec8b65" (UID: "6a8049a3-c453-40a7-8b86-ad14a7ec8b65"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:07:08.914687 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:08.914624 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-kube-api-access-hzqz5" (OuterVolumeSpecName: "kube-api-access-hzqz5") pod "6a8049a3-c453-40a7-8b86-ad14a7ec8b65" (UID: "6a8049a3-c453-40a7-8b86-ad14a7ec8b65"). InnerVolumeSpecName "kube-api-access-hzqz5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:07:09.013885 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.013827 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:07:09.013885 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.013850 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hzqz5\" (UniqueName: \"kubernetes.io/projected/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-kube-api-access-hzqz5\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:07:09.013885 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.013861 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6a8049a3-c453-40a7-8b86-ad14a7ec8b65-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:07:09.422923 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.422889 2576 generic.go:358] "Generic (PLEG): container finished" podID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerID="b00eda351987dbfdc29c6fe2bcce6d0a0621da7200870b1cf80a87aed5732450" exitCode=0 Apr 20 21:07:09.423330 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.422974 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" Apr 20 21:07:09.423330 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.422979 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" event={"ID":"6a8049a3-c453-40a7-8b86-ad14a7ec8b65","Type":"ContainerDied","Data":"b00eda351987dbfdc29c6fe2bcce6d0a0621da7200870b1cf80a87aed5732450"} Apr 20 21:07:09.423330 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.423021 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9" event={"ID":"6a8049a3-c453-40a7-8b86-ad14a7ec8b65","Type":"ContainerDied","Data":"54064259821a6b68225db6f317218124a97a816e5b3da3a719b2b5473cd4cbbc"} Apr 20 21:07:09.423330 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.423041 2576 scope.go:117] "RemoveContainer" containerID="16dd00b8e92a48afb1de700c65de2150aad338b99b9bdd7dcc1d38f9d0afa662" Apr 20 21:07:09.423735 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.423671 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 20 21:07:09.431550 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.431535 2576 scope.go:117] "RemoveContainer" containerID="b00eda351987dbfdc29c6fe2bcce6d0a0621da7200870b1cf80a87aed5732450" Apr 20 21:07:09.439668 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.439651 2576 scope.go:117] "RemoveContainer" containerID="69bda21086b2ade55f68a7123c01fceb030f89014ae0cc940ab438c7f5965232" Apr 20 21:07:09.445177 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.445153 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9"] Apr 20 21:07:09.447688 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.447668 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-nr6v9"] Apr 20 21:07:09.448034 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.448019 2576 scope.go:117] "RemoveContainer" containerID="16dd00b8e92a48afb1de700c65de2150aad338b99b9bdd7dcc1d38f9d0afa662" Apr 20 21:07:09.448310 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:07:09.448290 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16dd00b8e92a48afb1de700c65de2150aad338b99b9bdd7dcc1d38f9d0afa662\": container with ID starting with 16dd00b8e92a48afb1de700c65de2150aad338b99b9bdd7dcc1d38f9d0afa662 not found: ID does not exist" containerID="16dd00b8e92a48afb1de700c65de2150aad338b99b9bdd7dcc1d38f9d0afa662" Apr 20 21:07:09.448375 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.448318 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16dd00b8e92a48afb1de700c65de2150aad338b99b9bdd7dcc1d38f9d0afa662"} err="failed to get container status \"16dd00b8e92a48afb1de700c65de2150aad338b99b9bdd7dcc1d38f9d0afa662\": rpc error: code = NotFound desc = could not find container \"16dd00b8e92a48afb1de700c65de2150aad338b99b9bdd7dcc1d38f9d0afa662\": container with ID starting with 16dd00b8e92a48afb1de700c65de2150aad338b99b9bdd7dcc1d38f9d0afa662 not found: ID does not exist" Apr 20 21:07:09.448375 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.448340 
2576 scope.go:117] "RemoveContainer" containerID="b00eda351987dbfdc29c6fe2bcce6d0a0621da7200870b1cf80a87aed5732450" Apr 20 21:07:09.448567 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:07:09.448552 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b00eda351987dbfdc29c6fe2bcce6d0a0621da7200870b1cf80a87aed5732450\": container with ID starting with b00eda351987dbfdc29c6fe2bcce6d0a0621da7200870b1cf80a87aed5732450 not found: ID does not exist" containerID="b00eda351987dbfdc29c6fe2bcce6d0a0621da7200870b1cf80a87aed5732450" Apr 20 21:07:09.448610 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.448572 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b00eda351987dbfdc29c6fe2bcce6d0a0621da7200870b1cf80a87aed5732450"} err="failed to get container status \"b00eda351987dbfdc29c6fe2bcce6d0a0621da7200870b1cf80a87aed5732450\": rpc error: code = NotFound desc = could not find container \"b00eda351987dbfdc29c6fe2bcce6d0a0621da7200870b1cf80a87aed5732450\": container with ID starting with b00eda351987dbfdc29c6fe2bcce6d0a0621da7200870b1cf80a87aed5732450 not found: ID does not exist" Apr 20 21:07:09.448610 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.448585 2576 scope.go:117] "RemoveContainer" containerID="69bda21086b2ade55f68a7123c01fceb030f89014ae0cc940ab438c7f5965232" Apr 20 21:07:09.448780 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:07:09.448766 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69bda21086b2ade55f68a7123c01fceb030f89014ae0cc940ab438c7f5965232\": container with ID starting with 69bda21086b2ade55f68a7123c01fceb030f89014ae0cc940ab438c7f5965232 not found: ID does not exist" containerID="69bda21086b2ade55f68a7123c01fceb030f89014ae0cc940ab438c7f5965232" Apr 20 21:07:09.448819 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.448782 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69bda21086b2ade55f68a7123c01fceb030f89014ae0cc940ab438c7f5965232"} err="failed to get container status \"69bda21086b2ade55f68a7123c01fceb030f89014ae0cc940ab438c7f5965232\": rpc error: code = NotFound desc = could not find container \"69bda21086b2ade55f68a7123c01fceb030f89014ae0cc940ab438c7f5965232\": container with ID starting with 69bda21086b2ade55f68a7123c01fceb030f89014ae0cc940ab438c7f5965232 not found: ID does not exist" Apr 20 21:07:09.944730 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:09.944699 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" path="/var/lib/kubelet/pods/6a8049a3-c453-40a7-8b86-ad14a7ec8b65/volumes" Apr 20 21:07:14.427315 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:14.427287 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:07:14.427863 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:14.427834 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 20 21:07:24.428661 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:24.428622 2576 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 20 21:07:34.427955 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:34.427914 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 20 21:07:44.427879 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:44.427838 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 20 21:07:54.428176 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:07:54.428135 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 20 21:08:04.428579 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:04.428538 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 20 21:08:14.428317 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:14.428240 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:08:14.883347 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:14.883316 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w"] Apr 20 21:08:14.883717 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:14.883676 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kserve-container" containerID="cri-o://04f06d5e7788f3a098647943fa2012f1508313df9b9247eb031f663fe96030a4" gracePeriod=30 Apr 20 21:08:14.883829 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:14.883700 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kube-rbac-proxy" containerID="cri-o://38c55e32280b578ead892e122ffe209c395a42033969083e4f31491133fc3180" gracePeriod=30 Apr 20 21:08:15.646906 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:15.646871 2576 generic.go:358] "Generic (PLEG): container finished" podID="35b95e08-2815-4f36-a556-c809b43eee74" containerID="38c55e32280b578ead892e122ffe209c395a42033969083e4f31491133fc3180" exitCode=2 Apr 20 21:08:15.647295 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:15.646944 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" event={"ID":"35b95e08-2815-4f36-a556-c809b43eee74","Type":"ContainerDied","Data":"38c55e32280b578ead892e122ffe209c395a42033969083e4f31491133fc3180"} Apr 20 21:08:15.945352 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:15.945271 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt"] Apr 20 21:08:15.945617 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:15.945603 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kube-rbac-proxy" Apr 20 21:08:15.945666 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:15.945619 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kube-rbac-proxy" Apr 20 21:08:15.945666 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:15.945634 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kserve-container" Apr 20 21:08:15.945666 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:15.945639 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kserve-container" Apr 20 21:08:15.945666 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:15.945650 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="storage-initializer" Apr 20 21:08:15.945666 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:15.945656 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="storage-initializer" Apr 20 21:08:15.945843 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:15.945718 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kserve-container" Apr 20 21:08:15.945843 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:15.945729 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="6a8049a3-c453-40a7-8b86-ad14a7ec8b65" containerName="kube-rbac-proxy" Apr 20 21:08:15.948883 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:15.948859 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" Apr 20 21:08:15.951281 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:15.951253 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-predictor-serving-cert\"" Apr 20 21:08:15.951408 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:15.951283 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\"" Apr 20 21:08:15.959521 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:15.959502 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt"] Apr 20 21:08:16.013540 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:16.013511 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4093e5b0-9cc4-4195-8963-97e9be6a6254-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" Apr 20 21:08:16.013658 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:16.013555 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6qpj\" (UniqueName: \"kubernetes.io/projected/4093e5b0-9cc4-4195-8963-97e9be6a6254-kube-api-access-f6qpj\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" Apr 20 21:08:16.013830 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:16.013801 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4093e5b0-9cc4-4195-8963-97e9be6a6254-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" Apr 20 21:08:16.013948 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:16.013894 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4093e5b0-9cc4-4195-8963-97e9be6a6254-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" Apr 20 21:08:16.114940 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:16.114909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4093e5b0-9cc4-4195-8963-97e9be6a6254-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" Apr 20 21:08:16.115074 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:16.114954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f6qpj\" (UniqueName: \"kubernetes.io/projected/4093e5b0-9cc4-4195-8963-97e9be6a6254-kube-api-access-f6qpj\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" Apr 20 21:08:16.115074 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:08:16.115049 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found Apr 20 21:08:16.115180 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:08:16.115130 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4093e5b0-9cc4-4195-8963-97e9be6a6254-proxy-tls podName:4093e5b0-9cc4-4195-8963-97e9be6a6254 nodeName:}" failed. No retries permitted until 2026-04-20 21:08:16.615087123 +0000 UTC m=+3771.327769099 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4093e5b0-9cc4-4195-8963-97e9be6a6254-proxy-tls") pod "isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" (UID: "4093e5b0-9cc4-4195-8963-97e9be6a6254") : secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found Apr 20 21:08:16.115180 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:16.115153 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4093e5b0-9cc4-4195-8963-97e9be6a6254-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" Apr 20 21:08:16.115286 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:16.115183 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4093e5b0-9cc4-4195-8963-97e9be6a6254-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" Apr 20 21:08:16.115514 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:16.115499 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4093e5b0-9cc4-4195-8963-97e9be6a6254-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" Apr 20 21:08:16.115755 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:16.115738 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4093e5b0-9cc4-4195-8963-97e9be6a6254-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" Apr 20 21:08:16.125894 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:16.125876 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f6qpj\" (UniqueName: \"kubernetes.io/projected/4093e5b0-9cc4-4195-8963-97e9be6a6254-kube-api-access-f6qpj\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" Apr 20 21:08:16.619286 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:16.619251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4093e5b0-9cc4-4195-8963-97e9be6a6254-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" Apr 20 21:08:16.621695 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:16.621665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4093e5b0-9cc4-4195-8963-97e9be6a6254-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" Apr 20 21:08:16.859784 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:16.859754 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" Apr 20 21:08:16.974981 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:16.974958 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt"] Apr 20 21:08:16.983234 ip-10-0-143-23 kubenswrapper[2576]: W0420 21:08:16.977932 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4093e5b0_9cc4_4195_8963_97e9be6a6254.slice/crio-d7d6f3a777943acbe3f2cb25fc97ee73287cc4e9f508c3707c834118a951c7a5 WatchSource:0}: Error finding container d7d6f3a777943acbe3f2cb25fc97ee73287cc4e9f508c3707c834118a951c7a5: Status 404 returned error can't find the container with id d7d6f3a777943acbe3f2cb25fc97ee73287cc4e9f508c3707c834118a951c7a5 Apr 20 21:08:17.656093 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:17.656052 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" event={"ID":"4093e5b0-9cc4-4195-8963-97e9be6a6254","Type":"ContainerStarted","Data":"5dbdb3b7be9c3c186159b3adb255a68b30cad6cf7c9319e86cc4dceafedd777c"} Apr 20 21:08:17.656280 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:17.656100 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" event={"ID":"4093e5b0-9cc4-4195-8963-97e9be6a6254","Type":"ContainerStarted","Data":"d7d6f3a777943acbe3f2cb25fc97ee73287cc4e9f508c3707c834118a951c7a5"} Apr 20 21:08:18.928016 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:18.927995 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:08:19.037923 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.037896 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z248t\" (UniqueName: \"kubernetes.io/projected/35b95e08-2815-4f36-a556-c809b43eee74-kube-api-access-z248t\") pod \"35b95e08-2815-4f36-a556-c809b43eee74\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " Apr 20 21:08:19.038067 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.037942 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/35b95e08-2815-4f36-a556-c809b43eee74-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"35b95e08-2815-4f36-a556-c809b43eee74\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " Apr 20 21:08:19.038067 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.037984 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/35b95e08-2815-4f36-a556-c809b43eee74-cabundle-cert\") pod \"35b95e08-2815-4f36-a556-c809b43eee74\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " Apr 20 21:08:19.038067 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.038012 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/35b95e08-2815-4f36-a556-c809b43eee74-proxy-tls\") pod \"35b95e08-2815-4f36-a556-c809b43eee74\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " Apr 20 21:08:19.038067 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.038055 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35b95e08-2815-4f36-a556-c809b43eee74-kserve-provision-location\") pod \"35b95e08-2815-4f36-a556-c809b43eee74\" (UID: \"35b95e08-2815-4f36-a556-c809b43eee74\") " Apr 20 21:08:19.038409 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.038372 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35b95e08-2815-4f36-a556-c809b43eee74-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "35b95e08-2815-4f36-a556-c809b43eee74" (UID: "35b95e08-2815-4f36-a556-c809b43eee74"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:08:19.038518 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.038428 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b95e08-2815-4f36-a556-c809b43eee74-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "35b95e08-2815-4f36-a556-c809b43eee74" (UID: "35b95e08-2815-4f36-a556-c809b43eee74"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:08:19.038518 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.038418 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35b95e08-2815-4f36-a556-c809b43eee74-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config") pod "35b95e08-2815-4f36-a556-c809b43eee74" (UID: "35b95e08-2815-4f36-a556-c809b43eee74"). 
InnerVolumeSpecName "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:08:19.040155 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.040133 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35b95e08-2815-4f36-a556-c809b43eee74-kube-api-access-z248t" (OuterVolumeSpecName: "kube-api-access-z248t") pod "35b95e08-2815-4f36-a556-c809b43eee74" (UID: "35b95e08-2815-4f36-a556-c809b43eee74"). InnerVolumeSpecName "kube-api-access-z248t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:08:19.040217 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.040137 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35b95e08-2815-4f36-a556-c809b43eee74-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "35b95e08-2815-4f36-a556-c809b43eee74" (UID: "35b95e08-2815-4f36-a556-c809b43eee74"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:08:19.138696 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.138667 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/35b95e08-2815-4f36-a556-c809b43eee74-cabundle-cert\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:08:19.138696 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.138691 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/35b95e08-2815-4f36-a556-c809b43eee74-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:08:19.138858 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.138707 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/35b95e08-2815-4f36-a556-c809b43eee74-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:08:19.138858 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.138721 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z248t\" (UniqueName: \"kubernetes.io/projected/35b95e08-2815-4f36-a556-c809b43eee74-kube-api-access-z248t\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:08:19.138858 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.138734 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/35b95e08-2815-4f36-a556-c809b43eee74-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:08:19.665686 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.665654 2576 generic.go:358] "Generic (PLEG): container finished" podID="35b95e08-2815-4f36-a556-c809b43eee74" containerID="04f06d5e7788f3a098647943fa2012f1508313df9b9247eb031f663fe96030a4" exitCode=0 Apr 20 21:08:19.665927 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.665729 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" Apr 20 21:08:19.665927 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.665736 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" event={"ID":"35b95e08-2815-4f36-a556-c809b43eee74","Type":"ContainerDied","Data":"04f06d5e7788f3a098647943fa2012f1508313df9b9247eb031f663fe96030a4"} Apr 20 21:08:19.665927 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.665771 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w" event={"ID":"35b95e08-2815-4f36-a556-c809b43eee74","Type":"ContainerDied","Data":"dd80a975ed16033e2e19a0d0f9c59de9c5aac40cbdc12e9d70ef0dca87e1d8ab"} Apr 20 21:08:19.665927 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.665787 2576 scope.go:117] "RemoveContainer" containerID="38c55e32280b578ead892e122ffe209c395a42033969083e4f31491133fc3180" Apr 20 21:08:19.674105 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.674091 2576 scope.go:117] "RemoveContainer" containerID="04f06d5e7788f3a098647943fa2012f1508313df9b9247eb031f663fe96030a4" Apr 20 21:08:19.681127 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.681098 2576 scope.go:117] "RemoveContainer" containerID="1c8e118d3b8b96795646c064511c74734568ac68fc037c667b473ccc2b5ec795" Apr 20 21:08:19.686341 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.686319 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w"] Apr 20 21:08:19.688168 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.688154 2576 scope.go:117] "RemoveContainer" containerID="38c55e32280b578ead892e122ffe209c395a42033969083e4f31491133fc3180" Apr 20 21:08:19.688394 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:08:19.688376 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38c55e32280b578ead892e122ffe209c395a42033969083e4f31491133fc3180\": container with ID starting with 38c55e32280b578ead892e122ffe209c395a42033969083e4f31491133fc3180 not found: ID does not exist" containerID="38c55e32280b578ead892e122ffe209c395a42033969083e4f31491133fc3180" Apr 20 21:08:19.688455 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.688400 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c55e32280b578ead892e122ffe209c395a42033969083e4f31491133fc3180"} err="failed to get container status \"38c55e32280b578ead892e122ffe209c395a42033969083e4f31491133fc3180\": rpc error: code = NotFound desc = could not find container \"38c55e32280b578ead892e122ffe209c395a42033969083e4f31491133fc3180\": container with ID starting with 38c55e32280b578ead892e122ffe209c395a42033969083e4f31491133fc3180 not found: ID does not exist" Apr 20 21:08:19.688455 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.688415 2576 scope.go:117] "RemoveContainer" containerID="04f06d5e7788f3a098647943fa2012f1508313df9b9247eb031f663fe96030a4" Apr 20 21:08:19.688615 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:08:19.688601 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04f06d5e7788f3a098647943fa2012f1508313df9b9247eb031f663fe96030a4\": container with ID starting with 04f06d5e7788f3a098647943fa2012f1508313df9b9247eb031f663fe96030a4 not found: ID does not exist" 
containerID="04f06d5e7788f3a098647943fa2012f1508313df9b9247eb031f663fe96030a4" Apr 20 21:08:19.688656 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.688618 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f06d5e7788f3a098647943fa2012f1508313df9b9247eb031f663fe96030a4"} err="failed to get container status \"04f06d5e7788f3a098647943fa2012f1508313df9b9247eb031f663fe96030a4\": rpc error: code = NotFound desc = could not find container \"04f06d5e7788f3a098647943fa2012f1508313df9b9247eb031f663fe96030a4\": container with ID starting with 04f06d5e7788f3a098647943fa2012f1508313df9b9247eb031f663fe96030a4 not found: ID does not exist" Apr 20 21:08:19.688656 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.688629 2576 scope.go:117] "RemoveContainer" containerID="1c8e118d3b8b96795646c064511c74734568ac68fc037c667b473ccc2b5ec795" Apr 20 21:08:19.688855 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:08:19.688835 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c8e118d3b8b96795646c064511c74734568ac68fc037c667b473ccc2b5ec795\": container with ID starting with 1c8e118d3b8b96795646c064511c74734568ac68fc037c667b473ccc2b5ec795 not found: ID does not exist" containerID="1c8e118d3b8b96795646c064511c74734568ac68fc037c667b473ccc2b5ec795" Apr 20 21:08:19.688901 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.688860 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c8e118d3b8b96795646c064511c74734568ac68fc037c667b473ccc2b5ec795"} err="failed to get container status \"1c8e118d3b8b96795646c064511c74734568ac68fc037c667b473ccc2b5ec795\": rpc error: code = NotFound desc = could not find container \"1c8e118d3b8b96795646c064511c74734568ac68fc037c667b473ccc2b5ec795\": container with ID starting with 1c8e118d3b8b96795646c064511c74734568ac68fc037c667b473ccc2b5ec795 not found: ID does not exist" Apr 20 21:08:19.692324 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.692307 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-hdt5w"] Apr 20 21:08:19.944386 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:19.944317 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35b95e08-2815-4f36-a556-c809b43eee74" path="/var/lib/kubelet/pods/35b95e08-2815-4f36-a556-c809b43eee74/volumes" Apr 20 21:08:23.682514 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:23.682439 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt_4093e5b0-9cc4-4195-8963-97e9be6a6254/storage-initializer/0.log" Apr 20 21:08:23.682514 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:23.682476 2576 generic.go:358] "Generic (PLEG): container finished" podID="4093e5b0-9cc4-4195-8963-97e9be6a6254" containerID="5dbdb3b7be9c3c186159b3adb255a68b30cad6cf7c9319e86cc4dceafedd777c" exitCode=1 Apr 20 21:08:23.682891 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:23.682529 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" event={"ID":"4093e5b0-9cc4-4195-8963-97e9be6a6254","Type":"ContainerDied","Data":"5dbdb3b7be9c3c186159b3adb255a68b30cad6cf7c9319e86cc4dceafedd777c"} Apr 20 21:08:24.686750 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:24.686721 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt_4093e5b0-9cc4-4195-8963-97e9be6a6254/storage-initializer/0.log" Apr 20 21:08:24.687133 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:24.686796 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" event={"ID":"4093e5b0-9cc4-4195-8963-97e9be6a6254","Type":"ContainerStarted","Data":"6e309e4c503534f25ee003dea95bc9f1ecfb08d204fc80435060381a1fb43c26"} Apr 20 21:08:25.947664 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:25.947631 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt"] Apr 20 21:08:25.948103 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:25.947902 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" podUID="4093e5b0-9cc4-4195-8963-97e9be6a6254" containerName="storage-initializer" containerID="cri-o://6e309e4c503534f25ee003dea95bc9f1ecfb08d204fc80435060381a1fb43c26" gracePeriod=30 Apr 20 21:08:27.017182 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.017146 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r"] Apr 20 21:08:27.017534 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.017476 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kube-rbac-proxy" Apr 20 21:08:27.017534 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.017487 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kube-rbac-proxy" Apr 20 21:08:27.017534 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.017495 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="storage-initializer" Apr 20 21:08:27.017534 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.017500 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="storage-initializer" Apr 20 21:08:27.017534 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.017514 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kserve-container" Apr 20 21:08:27.017534 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.017519 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kserve-container" Apr 20 21:08:27.017727 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.017574 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kube-rbac-proxy" Apr 20 21:08:27.017727 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.017582 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="35b95e08-2815-4f36-a556-c809b43eee74" containerName="kserve-container" Apr 20 21:08:27.021960 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.021941 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.024536 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.024504 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert\"" Apr 20 21:08:27.024536 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.024519 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 20 21:08:27.024716 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.024582 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\"" Apr 20 21:08:27.029170 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.029132 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r"] Apr 20 21:08:27.101475 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.101447 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c25a2f8d-0388-4a23-9fe0-e50987535213-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.101614 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.101501 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfcxd\" (UniqueName: \"kubernetes.io/projected/c25a2f8d-0388-4a23-9fe0-e50987535213-kube-api-access-vfcxd\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.101614 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.101590 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c25a2f8d-0388-4a23-9fe0-e50987535213-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.101742 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.101650 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c25a2f8d-0388-4a23-9fe0-e50987535213-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.101742 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.101679 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c25a2f8d-0388-4a23-9fe0-e50987535213-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.202884 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.202853 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfcxd\" (UniqueName: \"kubernetes.io/projected/c25a2f8d-0388-4a23-9fe0-e50987535213-kube-api-access-vfcxd\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.203007 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.202905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c25a2f8d-0388-4a23-9fe0-e50987535213-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.203007 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.202953 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c25a2f8d-0388-4a23-9fe0-e50987535213-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.203007 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.202980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c25a2f8d-0388-4a23-9fe0-e50987535213-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.203222 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.203038 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c25a2f8d-0388-4a23-9fe0-e50987535213-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.203222 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:08:27.203185 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert" not found Apr 20 21:08:27.203326 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:08:27.203246 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c25a2f8d-0388-4a23-9fe0-e50987535213-proxy-tls podName:c25a2f8d-0388-4a23-9fe0-e50987535213 nodeName:}" failed. No retries permitted until 2026-04-20 21:08:27.703224846 +0000 UTC m=+3782.415906824 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c25a2f8d-0388-4a23-9fe0-e50987535213-proxy-tls") pod "isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" (UID: "c25a2f8d-0388-4a23-9fe0-e50987535213") : secret "isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert" not found Apr 20 21:08:27.203461 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.203440 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c25a2f8d-0388-4a23-9fe0-e50987535213-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.203564 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.203547 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c25a2f8d-0388-4a23-9fe0-e50987535213-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.203614 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.203596 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c25a2f8d-0388-4a23-9fe0-e50987535213-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.213884 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.213867 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfcxd\" (UniqueName: \"kubernetes.io/projected/c25a2f8d-0388-4a23-9fe0-e50987535213-kube-api-access-vfcxd\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.706930 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.706900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c25a2f8d-0388-4a23-9fe0-e50987535213-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.709276 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.709255 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c25a2f8d-0388-4a23-9fe0-e50987535213-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" Apr 20 21:08:27.933483 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:27.933448 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r"
Apr 20 21:08:28.058886 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:28.058858 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r"]
Apr 20 21:08:28.059726 ip-10-0-143-23 kubenswrapper[2576]: W0420 21:08:28.059700 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc25a2f8d_0388_4a23_9fe0_e50987535213.slice/crio-958598a6f471ef8a2bd013d7e9be1c5c10f29b35c46a59b1c8d4d3b10597aca5 WatchSource:0}: Error finding container 958598a6f471ef8a2bd013d7e9be1c5c10f29b35c46a59b1c8d4d3b10597aca5: Status 404 returned error can't find the container with id 958598a6f471ef8a2bd013d7e9be1c5c10f29b35c46a59b1c8d4d3b10597aca5
Apr 20 21:08:28.700603 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:28.700569 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" event={"ID":"c25a2f8d-0388-4a23-9fe0-e50987535213","Type":"ContainerStarted","Data":"6abff3d36a652d30ce7cde25b89d0d848bf42eb52209238ee8f5e69794f8097e"}
Apr 20 21:08:28.700603 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:28.700606 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" event={"ID":"c25a2f8d-0388-4a23-9fe0-e50987535213","Type":"ContainerStarted","Data":"958598a6f471ef8a2bd013d7e9be1c5c10f29b35c46a59b1c8d4d3b10597aca5"}
Apr 20 21:08:29.003174 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.003145 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt_4093e5b0-9cc4-4195-8963-97e9be6a6254/storage-initializer/1.log"
Apr 20 21:08:29.003578 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.003565 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt_4093e5b0-9cc4-4195-8963-97e9be6a6254/storage-initializer/0.log"
Apr 20 21:08:29.003648 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.003625 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt"
Apr 20 21:08:29.118210 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.118177 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4093e5b0-9cc4-4195-8963-97e9be6a6254-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"4093e5b0-9cc4-4195-8963-97e9be6a6254\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") "
Apr 20 21:08:29.118572 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.118229 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6qpj\" (UniqueName: \"kubernetes.io/projected/4093e5b0-9cc4-4195-8963-97e9be6a6254-kube-api-access-f6qpj\") pod \"4093e5b0-9cc4-4195-8963-97e9be6a6254\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") "
Apr 20 21:08:29.118572 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.118332 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4093e5b0-9cc4-4195-8963-97e9be6a6254-proxy-tls\") pod \"4093e5b0-9cc4-4195-8963-97e9be6a6254\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") "
Apr 20 21:08:29.118572 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.118367 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4093e5b0-9cc4-4195-8963-97e9be6a6254-kserve-provision-location\") pod \"4093e5b0-9cc4-4195-8963-97e9be6a6254\" (UID: \"4093e5b0-9cc4-4195-8963-97e9be6a6254\") "
Apr 20 21:08:29.118572 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.118501 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4093e5b0-9cc4-4195-8963-97e9be6a6254-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config") pod "4093e5b0-9cc4-4195-8963-97e9be6a6254" (UID: "4093e5b0-9cc4-4195-8963-97e9be6a6254"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:08:29.118821 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.118666 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4093e5b0-9cc4-4195-8963-97e9be6a6254-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4093e5b0-9cc4-4195-8963-97e9be6a6254" (UID: "4093e5b0-9cc4-4195-8963-97e9be6a6254"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:08:29.118821 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.118688 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4093e5b0-9cc4-4195-8963-97e9be6a6254-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 21:08:29.120198 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.120180 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4093e5b0-9cc4-4195-8963-97e9be6a6254-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4093e5b0-9cc4-4195-8963-97e9be6a6254" (UID: "4093e5b0-9cc4-4195-8963-97e9be6a6254"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:08:29.120280 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.120231 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4093e5b0-9cc4-4195-8963-97e9be6a6254-kube-api-access-f6qpj" (OuterVolumeSpecName: "kube-api-access-f6qpj") pod "4093e5b0-9cc4-4195-8963-97e9be6a6254" (UID: "4093e5b0-9cc4-4195-8963-97e9be6a6254"). InnerVolumeSpecName "kube-api-access-f6qpj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:08:29.219993 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.219971 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4093e5b0-9cc4-4195-8963-97e9be6a6254-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 21:08:29.220074 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.219995 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4093e5b0-9cc4-4195-8963-97e9be6a6254-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 21:08:29.220074 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.220006 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f6qpj\" (UniqueName: \"kubernetes.io/projected/4093e5b0-9cc4-4195-8963-97e9be6a6254-kube-api-access-f6qpj\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 21:08:29.704962 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.704928 2576 generic.go:358] "Generic (PLEG): container finished" podID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerID="6abff3d36a652d30ce7cde25b89d0d848bf42eb52209238ee8f5e69794f8097e" exitCode=0
Apr 20 21:08:29.705144 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.705018 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" event={"ID":"c25a2f8d-0388-4a23-9fe0-e50987535213","Type":"ContainerDied","Data":"6abff3d36a652d30ce7cde25b89d0d848bf42eb52209238ee8f5e69794f8097e"}
Apr 20 21:08:29.706266 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.706245 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt_4093e5b0-9cc4-4195-8963-97e9be6a6254/storage-initializer/1.log"
Apr 20 21:08:29.706614 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.706598 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt_4093e5b0-9cc4-4195-8963-97e9be6a6254/storage-initializer/0.log"
Apr 20 21:08:29.706710 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.706634 2576 generic.go:358] "Generic (PLEG): container finished" podID="4093e5b0-9cc4-4195-8963-97e9be6a6254" containerID="6e309e4c503534f25ee003dea95bc9f1ecfb08d204fc80435060381a1fb43c26" exitCode=1
Apr 20 21:08:29.706710 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.706663 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" event={"ID":"4093e5b0-9cc4-4195-8963-97e9be6a6254","Type":"ContainerDied","Data":"6e309e4c503534f25ee003dea95bc9f1ecfb08d204fc80435060381a1fb43c26"}
Apr 20 21:08:29.706710 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.706687 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt" event={"ID":"4093e5b0-9cc4-4195-8963-97e9be6a6254","Type":"ContainerDied","Data":"d7d6f3a777943acbe3f2cb25fc97ee73287cc4e9f508c3707c834118a951c7a5"}
Apr 20 21:08:29.706710 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.706701 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt"
Apr 20 21:08:29.706885 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.706706 2576 scope.go:117] "RemoveContainer" containerID="6e309e4c503534f25ee003dea95bc9f1ecfb08d204fc80435060381a1fb43c26"
Apr 20 21:08:29.714840 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.714808 2576 scope.go:117] "RemoveContainer" containerID="5dbdb3b7be9c3c186159b3adb255a68b30cad6cf7c9319e86cc4dceafedd777c"
Apr 20 21:08:29.721969 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.721951 2576 scope.go:117] "RemoveContainer" containerID="6e309e4c503534f25ee003dea95bc9f1ecfb08d204fc80435060381a1fb43c26"
Apr 20 21:08:29.722243 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:08:29.722225 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e309e4c503534f25ee003dea95bc9f1ecfb08d204fc80435060381a1fb43c26\": container with ID starting with 6e309e4c503534f25ee003dea95bc9f1ecfb08d204fc80435060381a1fb43c26 not found: ID does not exist" containerID="6e309e4c503534f25ee003dea95bc9f1ecfb08d204fc80435060381a1fb43c26"
Apr 20 21:08:29.722308 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.722251 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e309e4c503534f25ee003dea95bc9f1ecfb08d204fc80435060381a1fb43c26"} err="failed to get container status \"6e309e4c503534f25ee003dea95bc9f1ecfb08d204fc80435060381a1fb43c26\": rpc error: code = NotFound desc = could not find container \"6e309e4c503534f25ee003dea95bc9f1ecfb08d204fc80435060381a1fb43c26\": container with ID starting with 6e309e4c503534f25ee003dea95bc9f1ecfb08d204fc80435060381a1fb43c26 not found: ID does not exist"
Apr 20 21:08:29.722308 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.722267 2576 scope.go:117] "RemoveContainer" containerID="5dbdb3b7be9c3c186159b3adb255a68b30cad6cf7c9319e86cc4dceafedd777c"
Apr 20 21:08:29.722514 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:08:29.722493 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dbdb3b7be9c3c186159b3adb255a68b30cad6cf7c9319e86cc4dceafedd777c\": container with ID starting with 5dbdb3b7be9c3c186159b3adb255a68b30cad6cf7c9319e86cc4dceafedd777c not found: ID does not exist" containerID="5dbdb3b7be9c3c186159b3adb255a68b30cad6cf7c9319e86cc4dceafedd777c"
Apr 20 21:08:29.722580 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.722521 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dbdb3b7be9c3c186159b3adb255a68b30cad6cf7c9319e86cc4dceafedd777c"} err="failed to get container status \"5dbdb3b7be9c3c186159b3adb255a68b30cad6cf7c9319e86cc4dceafedd777c\": rpc error: code = NotFound desc = could not find container \"5dbdb3b7be9c3c186159b3adb255a68b30cad6cf7c9319e86cc4dceafedd777c\": container with ID starting with 5dbdb3b7be9c3c186159b3adb255a68b30cad6cf7c9319e86cc4dceafedd777c not found: ID does not exist"
Apr 20 21:08:29.754897 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.754874 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt"]
Apr 20 21:08:29.757845 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.757821 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-8bnzt"]
Apr 20 21:08:29.945000 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:29.944969 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4093e5b0-9cc4-4195-8963-97e9be6a6254" path="/var/lib/kubelet/pods/4093e5b0-9cc4-4195-8963-97e9be6a6254/volumes"
Apr 20 21:08:30.712087 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:30.712049 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" event={"ID":"c25a2f8d-0388-4a23-9fe0-e50987535213","Type":"ContainerStarted","Data":"e3d04649c8c1564154b60972b930c9cb14b183d2e5d585d819566b2578adc6ef"}
Apr 20 21:08:30.712087 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:30.712088 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" event={"ID":"c25a2f8d-0388-4a23-9fe0-e50987535213","Type":"ContainerStarted","Data":"138cdbbcb7893f63242c6d1228d2dd6d40f57cbc726f05aca55070e698aae5d2"}
Apr 20 21:08:30.712500 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:30.712217 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r"
Apr 20 21:08:30.731761 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:30.731717 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" podStartSLOduration=3.731704877 podStartE2EDuration="3.731704877s" podCreationTimestamp="2026-04-20 21:08:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:08:30.729943193 +0000 UTC m=+3785.442625190" watchObservedRunningTime="2026-04-20 21:08:30.731704877 +0000 UTC m=+3785.444386874"
Apr 20 21:08:31.715668 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:31.715633 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r"
Apr 20 21:08:31.716840 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:31.716813 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 20 21:08:32.719192 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:32.719158 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 20 21:08:37.723895 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:37.723863 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r"
Apr 20 21:08:37.724396 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:37.724373 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 20 21:08:47.724588 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:47.724550 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 20 21:08:57.724335 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:08:57.724295 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 20 21:09:07.724886 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:07.724846 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 20 21:09:17.725056 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:17.725018 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 20 21:09:27.724808 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:27.724766 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 20 21:09:37.725351 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:37.725269 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r"
Apr 20 21:09:47.077596 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:47.077561 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r"]
Apr 20 21:09:47.078006 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:47.077864 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kserve-container" containerID="cri-o://138cdbbcb7893f63242c6d1228d2dd6d40f57cbc726f05aca55070e698aae5d2" gracePeriod=30
Apr 20 21:09:47.078006 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:47.077900 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kube-rbac-proxy" containerID="cri-o://e3d04649c8c1564154b60972b930c9cb14b183d2e5d585d819566b2578adc6ef" gracePeriod=30
Apr 20 21:09:47.719406 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:47.719367 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.66:8643/healthz\": dial tcp 10.132.0.66:8643: connect: connection refused"
Apr 20 21:09:47.724916 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:47.724886 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 20 21:09:47.977602 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:47.977515 2576 generic.go:358] "Generic (PLEG): container finished" podID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerID="e3d04649c8c1564154b60972b930c9cb14b183d2e5d585d819566b2578adc6ef" exitCode=2
Apr 20 21:09:47.977749 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:47.977591 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" event={"ID":"c25a2f8d-0388-4a23-9fe0-e50987535213","Type":"ContainerDied","Data":"e3d04649c8c1564154b60972b930c9cb14b183d2e5d585d819566b2578adc6ef"}
Apr 20 21:09:48.141727 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.141690 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"]
Apr 20 21:09:48.142256 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.142240 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4093e5b0-9cc4-4195-8963-97e9be6a6254" containerName="storage-initializer"
Apr 20 21:09:48.142309 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.142260 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4093e5b0-9cc4-4195-8963-97e9be6a6254" containerName="storage-initializer"
Apr 20 21:09:48.142365 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.142354 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4093e5b0-9cc4-4195-8963-97e9be6a6254" containerName="storage-initializer"
Apr 20 21:09:48.142400 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.142373 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="4093e5b0-9cc4-4195-8963-97e9be6a6254" containerName="storage-initializer"
Apr 20 21:09:48.142470 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.142459 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4093e5b0-9cc4-4195-8963-97e9be6a6254" containerName="storage-initializer"
Apr 20 21:09:48.142505 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.142471 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="4093e5b0-9cc4-4195-8963-97e9be6a6254" containerName="storage-initializer"
Apr 20 21:09:48.145928 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.145912 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:09:48.148296 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.148274 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert\""
Apr 20 21:09:48.148435 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.148324 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\""
Apr 20 21:09:48.156534 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.156513 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"]
Apr 20 21:09:48.217949 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.217916 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df13aab4-5d14-4736-8415-35a8b0f266fd-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:09:48.218105 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.217967 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df13aab4-5d14-4736-8415-35a8b0f266fd-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:09:48.218105 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.218030 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df13aab4-5d14-4736-8415-35a8b0f266fd-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:09:48.218105 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.218072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrbcz\" (UniqueName: \"kubernetes.io/projected/df13aab4-5d14-4736-8415-35a8b0f266fd-kube-api-access-zrbcz\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:09:48.319352 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.319319 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df13aab4-5d14-4736-8415-35a8b0f266fd-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:09:48.319479 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.319364 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df13aab4-5d14-4736-8415-35a8b0f266fd-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:09:48.319479 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.319401 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zrbcz\" (UniqueName: \"kubernetes.io/projected/df13aab4-5d14-4736-8415-35a8b0f266fd-kube-api-access-zrbcz\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:09:48.319479 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:09:48.319466 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found
Apr 20 21:09:48.319639 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.319478 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df13aab4-5d14-4736-8415-35a8b0f266fd-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:09:48.319639 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:09:48.319537 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df13aab4-5d14-4736-8415-35a8b0f266fd-proxy-tls podName:df13aab4-5d14-4736-8415-35a8b0f266fd nodeName:}" failed. No retries permitted until 2026-04-20 21:09:48.819518095 +0000 UTC m=+3863.532200075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/df13aab4-5d14-4736-8415-35a8b0f266fd-proxy-tls") pod "isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7" (UID: "df13aab4-5d14-4736-8415-35a8b0f266fd") : secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found
Apr 20 21:09:48.319854 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.319826 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df13aab4-5d14-4736-8415-35a8b0f266fd-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:09:48.320206 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.320186 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df13aab4-5d14-4736-8415-35a8b0f266fd-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:09:48.328684 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.328664 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrbcz\" (UniqueName: \"kubernetes.io/projected/df13aab4-5d14-4736-8415-35a8b0f266fd-kube-api-access-zrbcz\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:09:48.824517 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.824477 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df13aab4-5d14-4736-8415-35a8b0f266fd-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:09:48.826812 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:48.826782 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df13aab4-5d14-4736-8415-35a8b0f266fd-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:09:49.057256 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:49.057216 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:09:49.176912 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:49.176887 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"]
Apr 20 21:09:49.178584 ip-10-0-143-23 kubenswrapper[2576]: W0420 21:09:49.178556 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf13aab4_5d14_4736_8415_35a8b0f266fd.slice/crio-d0ef32d0a722012eb3dcd61732c4bb55b6e03a61c9fe61981b67de1d2e9f01cd WatchSource:0}: Error finding container d0ef32d0a722012eb3dcd61732c4bb55b6e03a61c9fe61981b67de1d2e9f01cd: Status 404 returned error can't find the container with id d0ef32d0a722012eb3dcd61732c4bb55b6e03a61c9fe61981b67de1d2e9f01cd
Apr 20 21:09:49.985759 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:49.985723 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7" event={"ID":"df13aab4-5d14-4736-8415-35a8b0f266fd","Type":"ContainerStarted","Data":"6d81202fdabcdd7d2e1a92dd6a0f2e1d054549805ab31f7ce7856ffb976ecb4b"}
Apr 20 21:09:49.985927 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:49.985763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7" event={"ID":"df13aab4-5d14-4736-8415-35a8b0f266fd","Type":"ContainerStarted","Data":"d0ef32d0a722012eb3dcd61732c4bb55b6e03a61c9fe61981b67de1d2e9f01cd"}
Apr 20 21:09:50.991085 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:50.991050 2576 generic.go:358] "Generic (PLEG): container finished" podID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerID="138cdbbcb7893f63242c6d1228d2dd6d40f57cbc726f05aca55070e698aae5d2" exitCode=0
Apr 20 21:09:50.991442 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:50.991139 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" event={"ID":"c25a2f8d-0388-4a23-9fe0-e50987535213","Type":"ContainerDied","Data":"138cdbbcb7893f63242c6d1228d2dd6d40f57cbc726f05aca55070e698aae5d2"}
Apr 20 21:09:51.028722 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.028701 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r"
Apr 20 21:09:51.142905 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.142834 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c25a2f8d-0388-4a23-9fe0-e50987535213-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"c25a2f8d-0388-4a23-9fe0-e50987535213\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") "
Apr 20 21:09:51.142905 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.142870 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c25a2f8d-0388-4a23-9fe0-e50987535213-proxy-tls\") pod \"c25a2f8d-0388-4a23-9fe0-e50987535213\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") "
Apr 20 21:09:51.143095 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.142947 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfcxd\" (UniqueName: \"kubernetes.io/projected/c25a2f8d-0388-4a23-9fe0-e50987535213-kube-api-access-vfcxd\") pod \"c25a2f8d-0388-4a23-9fe0-e50987535213\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") "
Apr 20 21:09:51.143095 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.142976 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c25a2f8d-0388-4a23-9fe0-e50987535213-cabundle-cert\") pod \"c25a2f8d-0388-4a23-9fe0-e50987535213\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") "
Apr 20 21:09:51.143095 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.143027 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c25a2f8d-0388-4a23-9fe0-e50987535213-kserve-provision-location\") pod \"c25a2f8d-0388-4a23-9fe0-e50987535213\" (UID: \"c25a2f8d-0388-4a23-9fe0-e50987535213\") "
Apr 20 21:09:51.143349 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.143320 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c25a2f8d-0388-4a23-9fe0-e50987535213-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config") pod "c25a2f8d-0388-4a23-9fe0-e50987535213" (UID: "c25a2f8d-0388-4a23-9fe0-e50987535213"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:09:51.143466 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.143386 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c25a2f8d-0388-4a23-9fe0-e50987535213-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c25a2f8d-0388-4a23-9fe0-e50987535213" (UID: "c25a2f8d-0388-4a23-9fe0-e50987535213"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:09:51.143466 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.143426 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c25a2f8d-0388-4a23-9fe0-e50987535213-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "c25a2f8d-0388-4a23-9fe0-e50987535213" (UID: "c25a2f8d-0388-4a23-9fe0-e50987535213"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:09:51.144900 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.144882 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c25a2f8d-0388-4a23-9fe0-e50987535213-kube-api-access-vfcxd" (OuterVolumeSpecName: "kube-api-access-vfcxd") pod "c25a2f8d-0388-4a23-9fe0-e50987535213" (UID: "c25a2f8d-0388-4a23-9fe0-e50987535213"). InnerVolumeSpecName "kube-api-access-vfcxd". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:09:51.144955 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.144885 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c25a2f8d-0388-4a23-9fe0-e50987535213-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c25a2f8d-0388-4a23-9fe0-e50987535213" (UID: "c25a2f8d-0388-4a23-9fe0-e50987535213"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:09:51.244102 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.244068 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vfcxd\" (UniqueName: \"kubernetes.io/projected/c25a2f8d-0388-4a23-9fe0-e50987535213-kube-api-access-vfcxd\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 21:09:51.244102 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.244099 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c25a2f8d-0388-4a23-9fe0-e50987535213-cabundle-cert\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 21:09:51.244102 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.244126 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c25a2f8d-0388-4a23-9fe0-e50987535213-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 21:09:51.244308 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.244139 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c25a2f8d-0388-4a23-9fe0-e50987535213-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 21:09:51.244308 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.244150 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c25a2f8d-0388-4a23-9fe0-e50987535213-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 21:09:51.996510 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.996487 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r"
Apr 20 21:09:51.996860 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.996486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r" event={"ID":"c25a2f8d-0388-4a23-9fe0-e50987535213","Type":"ContainerDied","Data":"958598a6f471ef8a2bd013d7e9be1c5c10f29b35c46a59b1c8d4d3b10597aca5"}
Apr 20 21:09:51.996860 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:51.996603 2576 scope.go:117] "RemoveContainer" containerID="e3d04649c8c1564154b60972b930c9cb14b183d2e5d585d819566b2578adc6ef"
Apr 20 21:09:52.004321 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:52.004303 2576 scope.go:117] "RemoveContainer" containerID="138cdbbcb7893f63242c6d1228d2dd6d40f57cbc726f05aca55070e698aae5d2"
Apr 20 21:09:52.011229 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:52.011207 2576 scope.go:117] "RemoveContainer" containerID="6abff3d36a652d30ce7cde25b89d0d848bf42eb52209238ee8f5e69794f8097e"
Apr 20 21:09:52.016153 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:52.016133 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r"]
Apr 20 21:09:52.021298 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:52.021277 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-q486r"]
Apr 20 21:09:53.944444 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:53.944411 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" path="/var/lib/kubelet/pods/c25a2f8d-0388-4a23-9fe0-e50987535213/volumes"
Apr 20 21:09:57.016650 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:57.016623 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7_df13aab4-5d14-4736-8415-35a8b0f266fd/storage-initializer/0.log"
Apr 20 21:09:57.017074 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:57.016661 2576 generic.go:358] "Generic (PLEG): container finished" podID="df13aab4-5d14-4736-8415-35a8b0f266fd" containerID="6d81202fdabcdd7d2e1a92dd6a0f2e1d054549805ab31f7ce7856ffb976ecb4b" exitCode=1
Apr 20 21:09:57.017074 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:57.016725 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7" event={"ID":"df13aab4-5d14-4736-8415-35a8b0f266fd","Type":"ContainerDied","Data":"6d81202fdabcdd7d2e1a92dd6a0f2e1d054549805ab31f7ce7856ffb976ecb4b"}
Apr 20 21:09:58.022037 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:58.022002 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7_df13aab4-5d14-4736-8415-35a8b0f266fd/storage-initializer/0.log"
Apr 20 21:09:58.022518 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:58.022143 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7" event={"ID":"df13aab4-5d14-4736-8415-35a8b0f266fd","Type":"ContainerStarted","Data":"a791c448c55cb149b26fa245e16d1263342b813a0b3a56f4b03e468fbd15fa4a"}
Apr 20 21:09:58.183699 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:58.183670 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"]
Apr 20 21:09:59.025384 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.025347 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7" podUID="df13aab4-5d14-4736-8415-35a8b0f266fd" containerName="storage-initializer" containerID="cri-o://a791c448c55cb149b26fa245e16d1263342b813a0b3a56f4b03e468fbd15fa4a" gracePeriod=30
Apr 20 21:09:59.214173 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.214132 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"]
Apr 20 21:09:59.214497 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.214482 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="storage-initializer"
Apr 20 21:09:59.214497 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.214497 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="storage-initializer"
Apr 20 21:09:59.214619 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.214505 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kube-rbac-proxy"
Apr 20 21:09:59.214619 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.214511 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kube-rbac-proxy"
Apr 20 21:09:59.214619 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.214526 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kserve-container"
Apr 20 21:09:59.214619 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.214531 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kserve-container"
Apr 20 21:09:59.214619 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.214591 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kube-rbac-proxy"
Apr 20 21:09:59.214619 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.214600 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="c25a2f8d-0388-4a23-9fe0-e50987535213" containerName="kserve-container"
Apr 20 21:09:59.217431 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.217414 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.219928 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.219905 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert\""
Apr 20 21:09:59.219928 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.219925 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\""
Apr 20 21:09:59.220133 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.219926 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 20 21:09:59.227593 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.227572 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"]
Apr 20 21:09:59.305754 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.305731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ed70b9fb-aecd-4212-9e38-58c13048748a-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.305863 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.305768 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed70b9fb-aecd-4212-9e38-58c13048748a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.305863 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.305785 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxrss\" (UniqueName: \"kubernetes.io/projected/ed70b9fb-aecd-4212-9e38-58c13048748a-kube-api-access-nxrss\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.305863 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.305804 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed70b9fb-aecd-4212-9e38-58c13048748a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.305863 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.305862 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed70b9fb-aecd-4212-9e38-58c13048748a-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.406958 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.406934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed70b9fb-aecd-4212-9e38-58c13048748a-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.407048 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.406980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ed70b9fb-aecd-4212-9e38-58c13048748a-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.407048 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.407005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed70b9fb-aecd-4212-9e38-58c13048748a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.407048 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.407029 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxrss\" (UniqueName: \"kubernetes.io/projected/ed70b9fb-aecd-4212-9e38-58c13048748a-kube-api-access-nxrss\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.407225 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.407057 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed70b9fb-aecd-4212-9e38-58c13048748a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.407225 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:09:59.407135 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert" not found
Apr 20 21:09:59.407225 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:09:59.407195 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed70b9fb-aecd-4212-9e38-58c13048748a-proxy-tls podName:ed70b9fb-aecd-4212-9e38-58c13048748a nodeName:}" failed. No retries permitted until 2026-04-20 21:09:59.907175861 +0000 UTC m=+3874.619857841 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/ed70b9fb-aecd-4212-9e38-58c13048748a-proxy-tls") pod "isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" (UID: "ed70b9fb-aecd-4212-9e38-58c13048748a") : secret "isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert" not found
Apr 20 21:09:59.407447 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.407430 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed70b9fb-aecd-4212-9e38-58c13048748a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.407608 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.407591 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ed70b9fb-aecd-4212-9e38-58c13048748a-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.407647 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.407621 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed70b9fb-aecd-4212-9e38-58c13048748a-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.415325 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.415299 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxrss\" (UniqueName: \"kubernetes.io/projected/ed70b9fb-aecd-4212-9e38-58c13048748a-kube-api-access-nxrss\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.910985 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.910892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed70b9fb-aecd-4212-9e38-58c13048748a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:09:59.913172 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:09:59.913151 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed70b9fb-aecd-4212-9e38-58c13048748a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:10:00.128401 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:00.128369 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:10:00.254225 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:00.254182 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"]
Apr 20 21:10:00.257366 ip-10-0-143-23 kubenswrapper[2576]: W0420 21:10:00.257342 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded70b9fb_aecd_4212_9e38_58c13048748a.slice/crio-2db099133b189e056f4c24c788e45d33614c038d1c763ea8b41ff6999c0afda8 WatchSource:0}: Error finding container 2db099133b189e056f4c24c788e45d33614c038d1c763ea8b41ff6999c0afda8: Status 404 returned error can't find the container with id 2db099133b189e056f4c24c788e45d33614c038d1c763ea8b41ff6999c0afda8
Apr 20 21:10:01.033728 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:01.033696 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" event={"ID":"ed70b9fb-aecd-4212-9e38-58c13048748a","Type":"ContainerStarted","Data":"28f47eed3a0adbeaa2e848674f4c755e84e60855b93e7d9e122f3d237474a6eb"}
Apr 20 21:10:01.033728 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:01.033732 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" event={"ID":"ed70b9fb-aecd-4212-9e38-58c13048748a","Type":"ContainerStarted","Data":"2db099133b189e056f4c24c788e45d33614c038d1c763ea8b41ff6999c0afda8"}
Apr 20 21:10:02.037711 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:02.037679 2576 generic.go:358] "Generic (PLEG): container finished" podID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerID="28f47eed3a0adbeaa2e848674f4c755e84e60855b93e7d9e122f3d237474a6eb" exitCode=0
Apr 20 21:10:02.038083 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:02.037750 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" event={"ID":"ed70b9fb-aecd-4212-9e38-58c13048748a","Type":"ContainerDied","Data":"28f47eed3a0adbeaa2e848674f4c755e84e60855b93e7d9e122f3d237474a6eb"}
Apr 20 21:10:03.042809 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.042769 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" event={"ID":"ed70b9fb-aecd-4212-9e38-58c13048748a","Type":"ContainerStarted","Data":"ec8cba9a7533ae042762566867fc52d212bd583aa40ba070e61f64f541dc49b3"}
Apr 20 21:10:03.043207 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.042816 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" event={"ID":"ed70b9fb-aecd-4212-9e38-58c13048748a","Type":"ContainerStarted","Data":"a4ad1dfc83bb5e06cda3ba76a892d4094a7debaa3f0a9b341a7006fedb5c1628"}
Apr 20 21:10:03.043207 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.042960 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:10:03.064885 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.064836 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" podStartSLOduration=4.064823249 podStartE2EDuration="4.064823249s" podCreationTimestamp="2026-04-20 21:09:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:10:03.062928601 +0000 UTC m=+3877.775610598" watchObservedRunningTime="2026-04-20 21:10:03.064823249 +0000 UTC m=+3877.777505243"
Apr 20 21:10:03.565656 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.565636 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7_df13aab4-5d14-4736-8415-35a8b0f266fd/storage-initializer/1.log"
Apr 20 21:10:03.566040 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.566022 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7_df13aab4-5d14-4736-8415-35a8b0f266fd/storage-initializer/0.log"
Apr 20 21:10:03.566157 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.566095 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:10:03.638566 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.638483 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df13aab4-5d14-4736-8415-35a8b0f266fd-proxy-tls\") pod \"df13aab4-5d14-4736-8415-35a8b0f266fd\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") "
Apr 20 21:10:03.638566 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.638522 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df13aab4-5d14-4736-8415-35a8b0f266fd-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"df13aab4-5d14-4736-8415-35a8b0f266fd\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") "
Apr 20 21:10:03.638566 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.638540 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrbcz\" (UniqueName: \"kubernetes.io/projected/df13aab4-5d14-4736-8415-35a8b0f266fd-kube-api-access-zrbcz\") pod \"df13aab4-5d14-4736-8415-35a8b0f266fd\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") "
Apr 20 21:10:03.638796 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.638614 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df13aab4-5d14-4736-8415-35a8b0f266fd-kserve-provision-location\") pod \"df13aab4-5d14-4736-8415-35a8b0f266fd\" (UID: \"df13aab4-5d14-4736-8415-35a8b0f266fd\") "
Apr 20 21:10:03.638911 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.638891 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df13aab4-5d14-4736-8415-35a8b0f266fd-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config") pod "df13aab4-5d14-4736-8415-35a8b0f266fd" (UID: "df13aab4-5d14-4736-8415-35a8b0f266fd"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 21:10:03.638984 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.638956 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df13aab4-5d14-4736-8415-35a8b0f266fd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "df13aab4-5d14-4736-8415-35a8b0f266fd" (UID: "df13aab4-5d14-4736-8415-35a8b0f266fd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 21:10:03.640651 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.640626 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df13aab4-5d14-4736-8415-35a8b0f266fd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "df13aab4-5d14-4736-8415-35a8b0f266fd" (UID: "df13aab4-5d14-4736-8415-35a8b0f266fd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 21:10:03.640651 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.640630 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df13aab4-5d14-4736-8415-35a8b0f266fd-kube-api-access-zrbcz" (OuterVolumeSpecName: "kube-api-access-zrbcz") pod "df13aab4-5d14-4736-8415-35a8b0f266fd" (UID: "df13aab4-5d14-4736-8415-35a8b0f266fd"). InnerVolumeSpecName "kube-api-access-zrbcz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 21:10:03.739090 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.739062 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/df13aab4-5d14-4736-8415-35a8b0f266fd-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 21:10:03.739090 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.739086 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/df13aab4-5d14-4736-8415-35a8b0f266fd-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 21:10:03.739090 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.739096 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zrbcz\" (UniqueName: \"kubernetes.io/projected/df13aab4-5d14-4736-8415-35a8b0f266fd-kube-api-access-zrbcz\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 21:10:03.739307 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:03.739106 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df13aab4-5d14-4736-8415-35a8b0f266fd-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\""
Apr 20 21:10:04.047093 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.047067 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7_df13aab4-5d14-4736-8415-35a8b0f266fd/storage-initializer/1.log"
Apr 20 21:10:04.047490 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.047433 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7_df13aab4-5d14-4736-8415-35a8b0f266fd/storage-initializer/0.log"
Apr 20 21:10:04.047490 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.047475 2576 generic.go:358] "Generic (PLEG): container finished" podID="df13aab4-5d14-4736-8415-35a8b0f266fd" containerID="a791c448c55cb149b26fa245e16d1263342b813a0b3a56f4b03e468fbd15fa4a" exitCode=1
Apr 20 21:10:04.047564 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.047547 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"
Apr 20 21:10:04.047599 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.047554 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7" event={"ID":"df13aab4-5d14-4736-8415-35a8b0f266fd","Type":"ContainerDied","Data":"a791c448c55cb149b26fa245e16d1263342b813a0b3a56f4b03e468fbd15fa4a"}
Apr 20 21:10:04.047599 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.047588 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7" event={"ID":"df13aab4-5d14-4736-8415-35a8b0f266fd","Type":"ContainerDied","Data":"d0ef32d0a722012eb3dcd61732c4bb55b6e03a61c9fe61981b67de1d2e9f01cd"}
Apr 20 21:10:04.047666 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.047603 2576 scope.go:117] "RemoveContainer" containerID="a791c448c55cb149b26fa245e16d1263342b813a0b3a56f4b03e468fbd15fa4a"
Apr 20 21:10:04.048057 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.048036 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"
Apr 20 21:10:04.049319 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.049289 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused"
Apr 20 21:10:04.055676 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.055661 2576 scope.go:117] "RemoveContainer" containerID="6d81202fdabcdd7d2e1a92dd6a0f2e1d054549805ab31f7ce7856ffb976ecb4b"
Apr 20 21:10:04.062640 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.062622 2576 scope.go:117] "RemoveContainer" containerID="a791c448c55cb149b26fa245e16d1263342b813a0b3a56f4b03e468fbd15fa4a"
Apr 20 21:10:04.062900 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:10:04.062882 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a791c448c55cb149b26fa245e16d1263342b813a0b3a56f4b03e468fbd15fa4a\": container with ID starting with a791c448c55cb149b26fa245e16d1263342b813a0b3a56f4b03e468fbd15fa4a not found: ID does not exist" containerID="a791c448c55cb149b26fa245e16d1263342b813a0b3a56f4b03e468fbd15fa4a"
Apr 20 21:10:04.062945 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.062907 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a791c448c55cb149b26fa245e16d1263342b813a0b3a56f4b03e468fbd15fa4a"} err="failed to get container status \"a791c448c55cb149b26fa245e16d1263342b813a0b3a56f4b03e468fbd15fa4a\": rpc error: code = NotFound desc = could not find container \"a791c448c55cb149b26fa245e16d1263342b813a0b3a56f4b03e468fbd15fa4a\": container with ID starting with a791c448c55cb149b26fa245e16d1263342b813a0b3a56f4b03e468fbd15fa4a not found: ID does not exist"
Apr 20 21:10:04.062945 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.062924 2576 scope.go:117] "RemoveContainer"
containerID="6d81202fdabcdd7d2e1a92dd6a0f2e1d054549805ab31f7ce7856ffb976ecb4b" Apr 20 21:10:04.063181 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:10:04.063163 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d81202fdabcdd7d2e1a92dd6a0f2e1d054549805ab31f7ce7856ffb976ecb4b\": container with ID starting with 6d81202fdabcdd7d2e1a92dd6a0f2e1d054549805ab31f7ce7856ffb976ecb4b not found: ID does not exist" containerID="6d81202fdabcdd7d2e1a92dd6a0f2e1d054549805ab31f7ce7856ffb976ecb4b" Apr 20 21:10:04.063231 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.063187 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d81202fdabcdd7d2e1a92dd6a0f2e1d054549805ab31f7ce7856ffb976ecb4b"} err="failed to get container status \"6d81202fdabcdd7d2e1a92dd6a0f2e1d054549805ab31f7ce7856ffb976ecb4b\": rpc error: code = NotFound desc = could not find container \"6d81202fdabcdd7d2e1a92dd6a0f2e1d054549805ab31f7ce7856ffb976ecb4b\": container with ID starting with 6d81202fdabcdd7d2e1a92dd6a0f2e1d054549805ab31f7ce7856ffb976ecb4b not found: ID does not exist" Apr 20 21:10:04.083210 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.083189 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"] Apr 20 21:10:04.086971 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:04.086950 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-7jqd7"] Apr 20 21:10:05.051531 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:05.051498 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused" Apr 20 21:10:05.945471 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:05.945428 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df13aab4-5d14-4736-8415-35a8b0f266fd" path="/var/lib/kubelet/pods/df13aab4-5d14-4736-8415-35a8b0f266fd/volumes" Apr 20 21:10:10.055734 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:10.055704 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" Apr 20 21:10:10.056310 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:10.056286 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused" Apr 20 21:10:20.056975 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:20.056936 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused" Apr 20 21:10:30.056785 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:30.056747 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused" Apr 20 21:10:40.056528 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:40.056490 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused" Apr 20 21:10:50.056948 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:50.056910 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused" Apr 20 21:10:54.575201 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:54.575083 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 21:10:54.581805 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:10:54.581786 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 21:11:00.056638 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:00.056595 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused" Apr 20 21:11:10.057314 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:10.057241 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" Apr 20 21:11:19.251964 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:19.251929 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"] Apr 20 21:11:19.252450 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:19.252361 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kserve-container" containerID="cri-o://a4ad1dfc83bb5e06cda3ba76a892d4094a7debaa3f0a9b341a7006fedb5c1628" gracePeriod=30 Apr 20 21:11:19.252550 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:19.252519 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kube-rbac-proxy" containerID="cri-o://ec8cba9a7533ae042762566867fc52d212bd583aa40ba070e61f64f541dc49b3" gracePeriod=30 Apr 20 21:11:20.052034 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.051993 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.68:8643/healthz\": dial tcp 10.132.0.68:8643: connect: connection refused" Apr 20 21:11:20.056317 ip-10-0-143-23 
kubenswrapper[2576]: I0420 21:11:20.056278 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused" Apr 20 21:11:20.299144 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.299098 2576 generic.go:358] "Generic (PLEG): container finished" podID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerID="ec8cba9a7533ae042762566867fc52d212bd583aa40ba070e61f64f541dc49b3" exitCode=2 Apr 20 21:11:20.299144 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.299135 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" event={"ID":"ed70b9fb-aecd-4212-9e38-58c13048748a","Type":"ContainerDied","Data":"ec8cba9a7533ae042762566867fc52d212bd583aa40ba070e61f64f541dc49b3"} Apr 20 21:11:20.324277 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.324212 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h"] Apr 20 21:11:20.324598 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.324584 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df13aab4-5d14-4736-8415-35a8b0f266fd" containerName="storage-initializer" Apr 20 21:11:20.324653 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.324599 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="df13aab4-5d14-4736-8415-35a8b0f266fd" containerName="storage-initializer" Apr 20 21:11:20.324653 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.324611 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df13aab4-5d14-4736-8415-35a8b0f266fd" containerName="storage-initializer" Apr 20 21:11:20.324653 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.324617 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="df13aab4-5d14-4736-8415-35a8b0f266fd" containerName="storage-initializer" Apr 20 21:11:20.324759 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.324695 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="df13aab4-5d14-4736-8415-35a8b0f266fd" containerName="storage-initializer" Apr 20 21:11:20.324759 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.324711 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="df13aab4-5d14-4736-8415-35a8b0f266fd" containerName="storage-initializer" Apr 20 21:11:20.327833 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.327814 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:20.330193 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.330170 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert\"" Apr 20 21:11:20.330193 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.330186 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\"" Apr 20 21:11:20.337306 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.337281 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h"] Apr 20 21:11:20.410458 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.410434 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:20.410588 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.410484 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bb7h\" (UniqueName: \"kubernetes.io/projected/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-kube-api-access-8bb7h\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:20.410588 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.410565 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:20.410667 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.410605 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:20.511203 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.511169 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:20.511337 ip-10-0-143-23 kubenswrapper[2576]: 
I0420 21:11:20.511223 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bb7h\" (UniqueName: \"kubernetes.io/projected/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-kube-api-access-8bb7h\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:20.511337 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.511256 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:20.511337 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.511277 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:20.511711 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.511687 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:20.511833 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.511813 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:20.513529 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.513508 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:20.519712 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.519690 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bb7h\" (UniqueName: \"kubernetes.io/projected/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-kube-api-access-8bb7h\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:20.639755 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.639689 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:20.757246 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.757225 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h"] Apr 20 21:11:20.759423 ip-10-0-143-23 kubenswrapper[2576]: W0420 21:11:20.759397 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b0229ac_2d3d_435a_b292_a9b043f2ca0a.slice/crio-220feb259b1408bcf12f30be6bb315b0f23d32298b520d37db11a909afc53261 WatchSource:0}: Error finding container 220feb259b1408bcf12f30be6bb315b0f23d32298b520d37db11a909afc53261: Status 404 returned error can't find the container with id 220feb259b1408bcf12f30be6bb315b0f23d32298b520d37db11a909afc53261 Apr 20 21:11:20.761318 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:20.761300 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 21:11:21.304336 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:21.304302 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" event={"ID":"9b0229ac-2d3d-435a-b292-a9b043f2ca0a","Type":"ContainerStarted","Data":"8a61741f9120aa58565b0507964f8e909f6c16933eb79608feef58a8b2784c97"} Apr 20 21:11:21.304336 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:21.304338 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" event={"ID":"9b0229ac-2d3d-435a-b292-a9b043f2ca0a","Type":"ContainerStarted","Data":"220feb259b1408bcf12f30be6bb315b0f23d32298b520d37db11a909afc53261"} Apr 20 21:11:23.196990 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.196967 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" Apr 20 21:11:23.232878 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.232855 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed70b9fb-aecd-4212-9e38-58c13048748a-proxy-tls\") pod \"ed70b9fb-aecd-4212-9e38-58c13048748a\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " Apr 20 21:11:23.233027 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.232946 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed70b9fb-aecd-4212-9e38-58c13048748a-kserve-provision-location\") pod \"ed70b9fb-aecd-4212-9e38-58c13048748a\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " Apr 20 21:11:23.233027 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.232967 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ed70b9fb-aecd-4212-9e38-58c13048748a-cabundle-cert\") pod \"ed70b9fb-aecd-4212-9e38-58c13048748a\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " Apr 20 21:11:23.233179 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.233074 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed70b9fb-aecd-4212-9e38-58c13048748a-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"ed70b9fb-aecd-4212-9e38-58c13048748a\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " Apr 20 21:11:23.233179 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.233155 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxrss\" (UniqueName: \"kubernetes.io/projected/ed70b9fb-aecd-4212-9e38-58c13048748a-kube-api-access-nxrss\") pod \"ed70b9fb-aecd-4212-9e38-58c13048748a\" (UID: \"ed70b9fb-aecd-4212-9e38-58c13048748a\") " Apr 20 21:11:23.233371 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.233343 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed70b9fb-aecd-4212-9e38-58c13048748a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ed70b9fb-aecd-4212-9e38-58c13048748a" (UID: "ed70b9fb-aecd-4212-9e38-58c13048748a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:11:23.233488 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.233370 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed70b9fb-aecd-4212-9e38-58c13048748a-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "ed70b9fb-aecd-4212-9e38-58c13048748a" (UID: "ed70b9fb-aecd-4212-9e38-58c13048748a"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:11:23.233488 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.233453 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ed70b9fb-aecd-4212-9e38-58c13048748a-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:11:23.233488 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.233457 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed70b9fb-aecd-4212-9e38-58c13048748a-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config") pod "ed70b9fb-aecd-4212-9e38-58c13048748a" (UID: "ed70b9fb-aecd-4212-9e38-58c13048748a"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:11:23.233488 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.233469 2576 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ed70b9fb-aecd-4212-9e38-58c13048748a-cabundle-cert\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:11:23.235010 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.234985 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed70b9fb-aecd-4212-9e38-58c13048748a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ed70b9fb-aecd-4212-9e38-58c13048748a" (UID: "ed70b9fb-aecd-4212-9e38-58c13048748a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:11:23.235097 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.235013 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed70b9fb-aecd-4212-9e38-58c13048748a-kube-api-access-nxrss" (OuterVolumeSpecName: "kube-api-access-nxrss") pod "ed70b9fb-aecd-4212-9e38-58c13048748a" (UID: "ed70b9fb-aecd-4212-9e38-58c13048748a"). InnerVolumeSpecName "kube-api-access-nxrss". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:11:23.313505 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.313437 2576 generic.go:358] "Generic (PLEG): container finished" podID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerID="a4ad1dfc83bb5e06cda3ba76a892d4094a7debaa3f0a9b341a7006fedb5c1628" exitCode=0 Apr 20 21:11:23.313599 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.313515 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" Apr 20 21:11:23.313599 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.313525 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" event={"ID":"ed70b9fb-aecd-4212-9e38-58c13048748a","Type":"ContainerDied","Data":"a4ad1dfc83bb5e06cda3ba76a892d4094a7debaa3f0a9b341a7006fedb5c1628"} Apr 20 21:11:23.313599 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.313560 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7" event={"ID":"ed70b9fb-aecd-4212-9e38-58c13048748a","Type":"ContainerDied","Data":"2db099133b189e056f4c24c788e45d33614c038d1c763ea8b41ff6999c0afda8"} Apr 20 21:11:23.313599 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.313578 2576 scope.go:117] "RemoveContainer" containerID="ec8cba9a7533ae042762566867fc52d212bd583aa40ba070e61f64f541dc49b3" Apr 20 21:11:23.321787 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.321767 2576 scope.go:117] "RemoveContainer" containerID="a4ad1dfc83bb5e06cda3ba76a892d4094a7debaa3f0a9b341a7006fedb5c1628" Apr 20 21:11:23.328675 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.328658 2576 scope.go:117] "RemoveContainer" containerID="28f47eed3a0adbeaa2e848674f4c755e84e60855b93e7d9e122f3d237474a6eb" Apr 20 21:11:23.334266 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.334246 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ed70b9fb-aecd-4212-9e38-58c13048748a-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:11:23.334356 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.334268 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nxrss\" (UniqueName: \"kubernetes.io/projected/ed70b9fb-aecd-4212-9e38-58c13048748a-kube-api-access-nxrss\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:11:23.334356 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.334279 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ed70b9fb-aecd-4212-9e38-58c13048748a-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:11:23.336007 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.335986 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"] Apr 20 21:11:23.339954 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.339935 2576 scope.go:117] "RemoveContainer" containerID="ec8cba9a7533ae042762566867fc52d212bd583aa40ba070e61f64f541dc49b3" Apr 20 21:11:23.340333 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:11:23.340311 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec8cba9a7533ae042762566867fc52d212bd583aa40ba070e61f64f541dc49b3\": container with ID starting with ec8cba9a7533ae042762566867fc52d212bd583aa40ba070e61f64f541dc49b3 not found: ID does not exist" containerID="ec8cba9a7533ae042762566867fc52d212bd583aa40ba070e61f64f541dc49b3" Apr 20 21:11:23.340400 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.340341 2576 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ec8cba9a7533ae042762566867fc52d212bd583aa40ba070e61f64f541dc49b3"} err="failed to get container status \"ec8cba9a7533ae042762566867fc52d212bd583aa40ba070e61f64f541dc49b3\": rpc error: code = NotFound desc = could not find container \"ec8cba9a7533ae042762566867fc52d212bd583aa40ba070e61f64f541dc49b3\": container with ID starting with ec8cba9a7533ae042762566867fc52d212bd583aa40ba070e61f64f541dc49b3 not found: ID does not exist" Apr 20 21:11:23.340400 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.340366 2576 scope.go:117] "RemoveContainer" containerID="a4ad1dfc83bb5e06cda3ba76a892d4094a7debaa3f0a9b341a7006fedb5c1628" Apr 20 21:11:23.340611 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:11:23.340592 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ad1dfc83bb5e06cda3ba76a892d4094a7debaa3f0a9b341a7006fedb5c1628\": container with ID starting with a4ad1dfc83bb5e06cda3ba76a892d4094a7debaa3f0a9b341a7006fedb5c1628 not found: ID does not exist" containerID="a4ad1dfc83bb5e06cda3ba76a892d4094a7debaa3f0a9b341a7006fedb5c1628" Apr 20 21:11:23.340683 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.340622 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ad1dfc83bb5e06cda3ba76a892d4094a7debaa3f0a9b341a7006fedb5c1628"} err="failed to get container status \"a4ad1dfc83bb5e06cda3ba76a892d4094a7debaa3f0a9b341a7006fedb5c1628\": rpc error: code = NotFound desc = could not find container \"a4ad1dfc83bb5e06cda3ba76a892d4094a7debaa3f0a9b341a7006fedb5c1628\": container with ID starting with a4ad1dfc83bb5e06cda3ba76a892d4094a7debaa3f0a9b341a7006fedb5c1628 not found: ID does not exist" Apr 20 21:11:23.340683 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.340646 2576 scope.go:117] "RemoveContainer" containerID="28f47eed3a0adbeaa2e848674f4c755e84e60855b93e7d9e122f3d237474a6eb" Apr 20 21:11:23.340899 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:11:23.340880 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28f47eed3a0adbeaa2e848674f4c755e84e60855b93e7d9e122f3d237474a6eb\": container with ID starting with 28f47eed3a0adbeaa2e848674f4c755e84e60855b93e7d9e122f3d237474a6eb not found: ID does not exist" containerID="28f47eed3a0adbeaa2e848674f4c755e84e60855b93e7d9e122f3d237474a6eb" Apr 20 21:11:23.340961 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.340908 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28f47eed3a0adbeaa2e848674f4c755e84e60855b93e7d9e122f3d237474a6eb"} err="failed to get container status \"28f47eed3a0adbeaa2e848674f4c755e84e60855b93e7d9e122f3d237474a6eb\": rpc error: code = NotFound desc = could not find container \"28f47eed3a0adbeaa2e848674f4c755e84e60855b93e7d9e122f3d237474a6eb\": container with ID starting with 28f47eed3a0adbeaa2e848674f4c755e84e60855b93e7d9e122f3d237474a6eb not found: ID does not exist" Apr 20 21:11:23.341054 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.341037 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-kd9n7"] Apr 20 21:11:23.943573 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:23.943539 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" path="/var/lib/kubelet/pods/ed70b9fb-aecd-4212-9e38-58c13048748a/volumes" Apr 20 
21:11:24.318751 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:24.318724 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h_9b0229ac-2d3d-435a-b292-a9b043f2ca0a/storage-initializer/0.log" Apr 20 21:11:24.319087 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:24.318757 2576 generic.go:358] "Generic (PLEG): container finished" podID="9b0229ac-2d3d-435a-b292-a9b043f2ca0a" containerID="8a61741f9120aa58565b0507964f8e909f6c16933eb79608feef58a8b2784c97" exitCode=1 Apr 20 21:11:24.319087 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:24.318805 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" event={"ID":"9b0229ac-2d3d-435a-b292-a9b043f2ca0a","Type":"ContainerDied","Data":"8a61741f9120aa58565b0507964f8e909f6c16933eb79608feef58a8b2784c97"} Apr 20 21:11:25.323560 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:25.323530 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h_9b0229ac-2d3d-435a-b292-a9b043f2ca0a/storage-initializer/0.log" Apr 20 21:11:25.323923 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:25.323595 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" event={"ID":"9b0229ac-2d3d-435a-b292-a9b043f2ca0a","Type":"ContainerStarted","Data":"da0320fb735e87c6bcf5664e92f6047ab738fc1c64997f983937a8898b577b06"} Apr 20 21:11:30.325596 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.325562 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h"] Apr 20 21:11:30.326017 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.325947 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" podUID="9b0229ac-2d3d-435a-b292-a9b043f2ca0a" containerName="storage-initializer" containerID="cri-o://da0320fb735e87c6bcf5664e92f6047ab738fc1c64997f983937a8898b577b06" gracePeriod=30 Apr 20 21:11:30.340922 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.340897 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h_9b0229ac-2d3d-435a-b292-a9b043f2ca0a/storage-initializer/1.log" Apr 20 21:11:30.341419 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.341393 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h_9b0229ac-2d3d-435a-b292-a9b043f2ca0a/storage-initializer/0.log" Apr 20 21:11:30.341540 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.341450 2576 generic.go:358] "Generic (PLEG): container finished" podID="9b0229ac-2d3d-435a-b292-a9b043f2ca0a" containerID="da0320fb735e87c6bcf5664e92f6047ab738fc1c64997f983937a8898b577b06" exitCode=1 Apr 20 21:11:30.341540 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.341511 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" event={"ID":"9b0229ac-2d3d-435a-b292-a9b043f2ca0a","Type":"ContainerDied","Data":"da0320fb735e87c6bcf5664e92f6047ab738fc1c64997f983937a8898b577b06"} Apr 20 21:11:30.341631 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.341545 2576 scope.go:117] 
"RemoveContainer" containerID="8a61741f9120aa58565b0507964f8e909f6c16933eb79608feef58a8b2784c97" Apr 20 21:11:30.459012 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.458991 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h_9b0229ac-2d3d-435a-b292-a9b043f2ca0a/storage-initializer/1.log" Apr 20 21:11:30.459104 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.459055 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:30.591796 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.591729 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-proxy-tls\") pod \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " Apr 20 21:11:30.591899 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.591806 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " Apr 20 21:11:30.591899 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.591854 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-kserve-provision-location\") pod \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " Apr 20 21:11:30.591899 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.591883 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bb7h\" (UniqueName: \"kubernetes.io/projected/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-kube-api-access-8bb7h\") pod \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\" (UID: \"9b0229ac-2d3d-435a-b292-a9b043f2ca0a\") " Apr 20 21:11:30.592125 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.592090 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9b0229ac-2d3d-435a-b292-a9b043f2ca0a" (UID: "9b0229ac-2d3d-435a-b292-a9b043f2ca0a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:11:30.592195 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.592175 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config") pod "9b0229ac-2d3d-435a-b292-a9b043f2ca0a" (UID: "9b0229ac-2d3d-435a-b292-a9b043f2ca0a"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 21:11:30.593730 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.593710 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9b0229ac-2d3d-435a-b292-a9b043f2ca0a" (UID: "9b0229ac-2d3d-435a-b292-a9b043f2ca0a"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 21:11:30.593879 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.593861 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-kube-api-access-8bb7h" (OuterVolumeSpecName: "kube-api-access-8bb7h") pod "9b0229ac-2d3d-435a-b292-a9b043f2ca0a" (UID: "9b0229ac-2d3d-435a-b292-a9b043f2ca0a"). InnerVolumeSpecName "kube-api-access-8bb7h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:11:30.692597 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.692571 2576 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-proxy-tls\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:11:30.692597 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.692594 2576 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:11:30.692740 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.692604 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-kserve-provision-location\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:11:30.692740 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:30.692614 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8bb7h\" (UniqueName: \"kubernetes.io/projected/9b0229ac-2d3d-435a-b292-a9b043f2ca0a-kube-api-access-8bb7h\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:11:31.345362 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:31.345334 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h_9b0229ac-2d3d-435a-b292-a9b043f2ca0a/storage-initializer/1.log" Apr 20 21:11:31.345757 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:31.345462 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" Apr 20 21:11:31.345757 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:31.345465 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h" event={"ID":"9b0229ac-2d3d-435a-b292-a9b043f2ca0a","Type":"ContainerDied","Data":"220feb259b1408bcf12f30be6bb315b0f23d32298b520d37db11a909afc53261"} Apr 20 21:11:31.345757 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:31.345575 2576 scope.go:117] "RemoveContainer" containerID="da0320fb735e87c6bcf5664e92f6047ab738fc1c64997f983937a8898b577b06" Apr 20 21:11:31.379609 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:31.379586 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h"] Apr 20 21:11:31.382752 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:31.382730 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-t6t7h"] Apr 20 21:11:31.944381 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:31.944347 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b0229ac-2d3d-435a-b292-a9b043f2ca0a" path="/var/lib/kubelet/pods/9b0229ac-2d3d-435a-b292-a9b043f2ca0a/volumes" Apr 20 21:11:32.664800 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.664766 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8pqmf/must-gather-v95wx"] Apr 20 21:11:32.665187 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.665077 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b0229ac-2d3d-435a-b292-a9b043f2ca0a" containerName="storage-initializer" Apr 20 21:11:32.665187 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.665087 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0229ac-2d3d-435a-b292-a9b043f2ca0a" containerName="storage-initializer" Apr 20 21:11:32.665187 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.665096 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kube-rbac-proxy" Apr 20 21:11:32.665187 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.665102 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kube-rbac-proxy" Apr 20 21:11:32.665187 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.665124 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b0229ac-2d3d-435a-b292-a9b043f2ca0a" containerName="storage-initializer" Apr 20 21:11:32.665187 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.665130 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b0229ac-2d3d-435a-b292-a9b043f2ca0a" containerName="storage-initializer" Apr 20 21:11:32.665187 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.665144 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="storage-initializer" Apr 20 21:11:32.665187 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.665152 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="storage-initializer" Apr 20 21:11:32.665187 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.665160 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kserve-container" Apr 20 21:11:32.665187 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.665165 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kserve-container" Apr 20 21:11:32.665511 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.665227 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kserve-container" Apr 20 21:11:32.665511 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.665237 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b0229ac-2d3d-435a-b292-a9b043f2ca0a" containerName="storage-initializer" Apr 20 21:11:32.665511 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.665244 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b0229ac-2d3d-435a-b292-a9b043f2ca0a" containerName="storage-initializer" Apr 20 21:11:32.665511 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.665252 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ed70b9fb-aecd-4212-9e38-58c13048748a" containerName="kube-rbac-proxy" Apr 20 21:11:32.668378 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.668362 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8pqmf/must-gather-v95wx" Apr 20 21:11:32.670844 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.670812 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8pqmf\"/\"default-dockercfg-twn47\"" Apr 20 21:11:32.671830 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.671811 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8pqmf\"/\"openshift-service-ca.crt\"" Apr 20 21:11:32.671830 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.671830 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8pqmf\"/\"kube-root-ca.crt\"" Apr 20 21:11:32.679731 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.679713 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8pqmf/must-gather-v95wx"] Apr 20 21:11:32.809685 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.809655 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5qb7\" (UniqueName: \"kubernetes.io/projected/f2a5aac9-c17f-42b9-ba83-e65ae7320cd5-kube-api-access-g5qb7\") pod \"must-gather-v95wx\" (UID: \"f2a5aac9-c17f-42b9-ba83-e65ae7320cd5\") " pod="openshift-must-gather-8pqmf/must-gather-v95wx" Apr 20 21:11:32.809807 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.809728 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f2a5aac9-c17f-42b9-ba83-e65ae7320cd5-must-gather-output\") pod \"must-gather-v95wx\" (UID: \"f2a5aac9-c17f-42b9-ba83-e65ae7320cd5\") " pod="openshift-must-gather-8pqmf/must-gather-v95wx" Apr 20 21:11:32.910441 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.910413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f2a5aac9-c17f-42b9-ba83-e65ae7320cd5-must-gather-output\") pod \"must-gather-v95wx\" (UID: \"f2a5aac9-c17f-42b9-ba83-e65ae7320cd5\") " pod="openshift-must-gather-8pqmf/must-gather-v95wx" Apr 20 21:11:32.910555 
ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.910461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5qb7\" (UniqueName: \"kubernetes.io/projected/f2a5aac9-c17f-42b9-ba83-e65ae7320cd5-kube-api-access-g5qb7\") pod \"must-gather-v95wx\" (UID: \"f2a5aac9-c17f-42b9-ba83-e65ae7320cd5\") " pod="openshift-must-gather-8pqmf/must-gather-v95wx" Apr 20 21:11:32.910797 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.910780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f2a5aac9-c17f-42b9-ba83-e65ae7320cd5-must-gather-output\") pod \"must-gather-v95wx\" (UID: \"f2a5aac9-c17f-42b9-ba83-e65ae7320cd5\") " pod="openshift-must-gather-8pqmf/must-gather-v95wx" Apr 20 21:11:32.917987 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.917938 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5qb7\" (UniqueName: \"kubernetes.io/projected/f2a5aac9-c17f-42b9-ba83-e65ae7320cd5-kube-api-access-g5qb7\") pod \"must-gather-v95wx\" (UID: \"f2a5aac9-c17f-42b9-ba83-e65ae7320cd5\") " pod="openshift-must-gather-8pqmf/must-gather-v95wx" Apr 20 21:11:32.989915 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:32.989894 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8pqmf/must-gather-v95wx" Apr 20 21:11:33.107958 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:33.107932 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8pqmf/must-gather-v95wx"] Apr 20 21:11:33.110570 ip-10-0-143-23 kubenswrapper[2576]: W0420 21:11:33.110544 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2a5aac9_c17f_42b9_ba83_e65ae7320cd5.slice/crio-ad0e927e0089b65962e1565fbb30475908a588ee41f2ce9583f24e97a2f96519 WatchSource:0}: Error finding container ad0e927e0089b65962e1565fbb30475908a588ee41f2ce9583f24e97a2f96519: Status 404 returned error can't find the container with id ad0e927e0089b65962e1565fbb30475908a588ee41f2ce9583f24e97a2f96519 Apr 20 21:11:33.353671 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:33.353640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8pqmf/must-gather-v95wx" event={"ID":"f2a5aac9-c17f-42b9-ba83-e65ae7320cd5","Type":"ContainerStarted","Data":"ad0e927e0089b65962e1565fbb30475908a588ee41f2ce9583f24e97a2f96519"} Apr 20 21:11:38.374355 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:38.374317 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8pqmf/must-gather-v95wx" event={"ID":"f2a5aac9-c17f-42b9-ba83-e65ae7320cd5","Type":"ContainerStarted","Data":"27ac41bc6b697e12c240fb0ea55d25a4dec847e1ec354216c8d390d4f62646c9"} Apr 20 21:11:38.374355 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:38.374360 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8pqmf/must-gather-v95wx" event={"ID":"f2a5aac9-c17f-42b9-ba83-e65ae7320cd5","Type":"ContainerStarted","Data":"c23a78c716686a5317b1ee1cdb683d9a94cdae6a972ec9cc6955fc988ba53f21"} Apr 20 21:11:38.392567 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:38.392521 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8pqmf/must-gather-v95wx" podStartSLOduration=1.756535966 podStartE2EDuration="6.392505436s" podCreationTimestamp="2026-04-20 21:11:32 +0000 UTC" firstStartedPulling="2026-04-20 
21:11:33.112308005 +0000 UTC m=+3967.824989980" lastFinishedPulling="2026-04-20 21:11:37.748277471 +0000 UTC m=+3972.460959450" observedRunningTime="2026-04-20 21:11:38.39047449 +0000 UTC m=+3973.103156487" watchObservedRunningTime="2026-04-20 21:11:38.392505436 +0000 UTC m=+3973.105187432" Apr 20 21:11:59.453624 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:59.453592 2576 generic.go:358] "Generic (PLEG): container finished" podID="f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" containerID="c23a78c716686a5317b1ee1cdb683d9a94cdae6a972ec9cc6955fc988ba53f21" exitCode=0 Apr 20 21:11:59.454048 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:59.453653 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8pqmf/must-gather-v95wx" event={"ID":"f2a5aac9-c17f-42b9-ba83-e65ae7320cd5","Type":"ContainerDied","Data":"c23a78c716686a5317b1ee1cdb683d9a94cdae6a972ec9cc6955fc988ba53f21"} Apr 20 21:11:59.454048 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:59.453950 2576 scope.go:117] "RemoveContainer" containerID="c23a78c716686a5317b1ee1cdb683d9a94cdae6a972ec9cc6955fc988ba53f21" Apr 20 21:11:59.518521 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:11:59.518497 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8pqmf_must-gather-v95wx_f2a5aac9-c17f-42b9-ba83-e65ae7320cd5/gather/0.log" Apr 20 21:12:02.770807 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:02.770754 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-7wn8c_2f559047-01c4-4f3b-a5a7-1c183af11e8f/global-pull-secret-syncer/0.log" Apr 20 21:12:03.003891 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:03.003861 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-fsxdh_0a9e3157-7a26-4d6c-9f83-58c9eca7c51a/konnectivity-agent/0.log" Apr 20 21:12:03.072390 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:03.072362 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-23.ec2.internal_7fe58c6b7d5dfc478334c93846063b98/haproxy/0.log" Apr 20 21:12:05.009056 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.009016 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-8pqmf/must-gather-v95wx"] Apr 20 21:12:05.009621 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.009307 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-8pqmf/must-gather-v95wx" podUID="f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" containerName="copy" containerID="cri-o://27ac41bc6b697e12c240fb0ea55d25a4dec847e1ec354216c8d390d4f62646c9" gracePeriod=2 Apr 20 21:12:05.011611 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.011576 2576 status_manager.go:895] "Failed to get status for pod" podUID="f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" pod="openshift-must-gather-8pqmf/must-gather-v95wx" err="pods \"must-gather-v95wx\" is forbidden: User \"system:node:ip-10-0-143-23.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-8pqmf\": no relationship found between node 'ip-10-0-143-23.ec2.internal' and this object" Apr 20 21:12:05.012653 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.012629 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-8pqmf/must-gather-v95wx"] Apr 20 21:12:05.237480 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.237456 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-8pqmf_must-gather-v95wx_f2a5aac9-c17f-42b9-ba83-e65ae7320cd5/copy/0.log" Apr 20 21:12:05.237803 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.237781 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8pqmf/must-gather-v95wx" Apr 20 21:12:05.239942 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.239916 2576 status_manager.go:895] "Failed to get status for pod" podUID="f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" pod="openshift-must-gather-8pqmf/must-gather-v95wx" err="pods \"must-gather-v95wx\" is forbidden: User \"system:node:ip-10-0-143-23.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-8pqmf\": no relationship found between node 'ip-10-0-143-23.ec2.internal' and this object" Apr 20 21:12:05.275125 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.275059 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5qb7\" (UniqueName: \"kubernetes.io/projected/f2a5aac9-c17f-42b9-ba83-e65ae7320cd5-kube-api-access-g5qb7\") pod \"f2a5aac9-c17f-42b9-ba83-e65ae7320cd5\" (UID: \"f2a5aac9-c17f-42b9-ba83-e65ae7320cd5\") " Apr 20 21:12:05.275125 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.275094 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f2a5aac9-c17f-42b9-ba83-e65ae7320cd5-must-gather-output\") pod \"f2a5aac9-c17f-42b9-ba83-e65ae7320cd5\" (UID: \"f2a5aac9-c17f-42b9-ba83-e65ae7320cd5\") " Apr 20 21:12:05.276561 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.276536 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2a5aac9-c17f-42b9-ba83-e65ae7320cd5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" (UID: "f2a5aac9-c17f-42b9-ba83-e65ae7320cd5"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 21:12:05.277203 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.277172 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2a5aac9-c17f-42b9-ba83-e65ae7320cd5-kube-api-access-g5qb7" (OuterVolumeSpecName: "kube-api-access-g5qb7") pod "f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" (UID: "f2a5aac9-c17f-42b9-ba83-e65ae7320cd5"). InnerVolumeSpecName "kube-api-access-g5qb7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 21:12:05.375819 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.375796 2576 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f2a5aac9-c17f-42b9-ba83-e65ae7320cd5-must-gather-output\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:12:05.375819 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.375820 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5qb7\" (UniqueName: \"kubernetes.io/projected/f2a5aac9-c17f-42b9-ba83-e65ae7320cd5-kube-api-access-g5qb7\") on node \"ip-10-0-143-23.ec2.internal\" DevicePath \"\"" Apr 20 21:12:05.475984 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.475966 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-8pqmf_must-gather-v95wx_f2a5aac9-c17f-42b9-ba83-e65ae7320cd5/copy/0.log" Apr 20 21:12:05.476309 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.476285 2576 generic.go:358] "Generic (PLEG): container finished" podID="f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" containerID="27ac41bc6b697e12c240fb0ea55d25a4dec847e1ec354216c8d390d4f62646c9" exitCode=143 Apr 20 21:12:05.476406 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.476350 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8pqmf/must-gather-v95wx" Apr 20 21:12:05.476406 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.476390 2576 scope.go:117] "RemoveContainer" containerID="27ac41bc6b697e12c240fb0ea55d25a4dec847e1ec354216c8d390d4f62646c9" Apr 20 21:12:05.478798 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.478765 2576 status_manager.go:895] "Failed to get status for pod" podUID="f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" pod="openshift-must-gather-8pqmf/must-gather-v95wx" err="pods \"must-gather-v95wx\" is forbidden: User \"system:node:ip-10-0-143-23.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-8pqmf\": no relationship found between node 'ip-10-0-143-23.ec2.internal' and this object" Apr 20 21:12:05.484268 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.484234 2576 scope.go:117] "RemoveContainer" containerID="c23a78c716686a5317b1ee1cdb683d9a94cdae6a972ec9cc6955fc988ba53f21" Apr 20 21:12:05.486132 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.486095 2576 status_manager.go:895] "Failed to get status for pod" podUID="f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" pod="openshift-must-gather-8pqmf/must-gather-v95wx" err="pods \"must-gather-v95wx\" is forbidden: User \"system:node:ip-10-0-143-23.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-8pqmf\": no relationship found between node 'ip-10-0-143-23.ec2.internal' and this object" Apr 20 21:12:05.494817 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.494799 2576 scope.go:117] "RemoveContainer" containerID="27ac41bc6b697e12c240fb0ea55d25a4dec847e1ec354216c8d390d4f62646c9" Apr 20 21:12:05.495069 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:12:05.495048 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27ac41bc6b697e12c240fb0ea55d25a4dec847e1ec354216c8d390d4f62646c9\": container with ID starting with 27ac41bc6b697e12c240fb0ea55d25a4dec847e1ec354216c8d390d4f62646c9 not found: ID does not exist" containerID="27ac41bc6b697e12c240fb0ea55d25a4dec847e1ec354216c8d390d4f62646c9" Apr 20 21:12:05.495168 
ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.495080 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ac41bc6b697e12c240fb0ea55d25a4dec847e1ec354216c8d390d4f62646c9"} err="failed to get container status \"27ac41bc6b697e12c240fb0ea55d25a4dec847e1ec354216c8d390d4f62646c9\": rpc error: code = NotFound desc = could not find container \"27ac41bc6b697e12c240fb0ea55d25a4dec847e1ec354216c8d390d4f62646c9\": container with ID starting with 27ac41bc6b697e12c240fb0ea55d25a4dec847e1ec354216c8d390d4f62646c9 not found: ID does not exist" Apr 20 21:12:05.495168 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.495105 2576 scope.go:117] "RemoveContainer" containerID="c23a78c716686a5317b1ee1cdb683d9a94cdae6a972ec9cc6955fc988ba53f21" Apr 20 21:12:05.495337 ip-10-0-143-23 kubenswrapper[2576]: E0420 21:12:05.495318 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c23a78c716686a5317b1ee1cdb683d9a94cdae6a972ec9cc6955fc988ba53f21\": container with ID starting with c23a78c716686a5317b1ee1cdb683d9a94cdae6a972ec9cc6955fc988ba53f21 not found: ID does not exist" containerID="c23a78c716686a5317b1ee1cdb683d9a94cdae6a972ec9cc6955fc988ba53f21" Apr 20 21:12:05.495397 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.495346 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c23a78c716686a5317b1ee1cdb683d9a94cdae6a972ec9cc6955fc988ba53f21"} err="failed to get container status \"c23a78c716686a5317b1ee1cdb683d9a94cdae6a972ec9cc6955fc988ba53f21\": rpc error: code = NotFound desc = could not find container \"c23a78c716686a5317b1ee1cdb683d9a94cdae6a972ec9cc6955fc988ba53f21\": container with ID starting with c23a78c716686a5317b1ee1cdb683d9a94cdae6a972ec9cc6955fc988ba53f21 not found: ID does not exist" Apr 20 21:12:05.942962 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.942933 2576 status_manager.go:895] "Failed to get status for pod" podUID="f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" pod="openshift-must-gather-8pqmf/must-gather-v95wx" err="pods \"must-gather-v95wx\" is forbidden: User \"system:node:ip-10-0-143-23.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-8pqmf\": no relationship found between node 'ip-10-0-143-23.ec2.internal' and this object" Apr 20 21:12:05.945202 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:05.945179 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" path="/var/lib/kubelet/pods/f2a5aac9-c17f-42b9-ba83-e65ae7320cd5/volumes" Apr 20 21:12:06.677244 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:06.677216 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-srrtl_76f7379a-b448-41ee-851d-a712d528cbda/node-exporter/0.log" Apr 20 21:12:06.700605 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:06.700579 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-srrtl_76f7379a-b448-41ee-851d-a712d528cbda/kube-rbac-proxy/0.log" Apr 20 21:12:06.721283 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:06.721260 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-srrtl_76f7379a-b448-41ee-851d-a712d528cbda/init-textfile/0.log" Apr 20 21:12:07.135600 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:07.135569 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-8fmnj_695e2e8f-4584-4016-a017-c63df5f16230/prometheus-operator-admission-webhook/0.log" Apr 20 21:12:07.254091 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:07.254066 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-575b747bcc-xbtfs_874f316f-a84f-4d91-97df-8300d3e42b26/thanos-query/0.log" Apr 20 21:12:07.279881 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:07.279855 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-575b747bcc-xbtfs_874f316f-a84f-4d91-97df-8300d3e42b26/kube-rbac-proxy-web/0.log" Apr 20 21:12:07.308968 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:07.308951 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-575b747bcc-xbtfs_874f316f-a84f-4d91-97df-8300d3e42b26/kube-rbac-proxy/0.log" Apr 20 21:12:07.333631 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:07.333605 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-575b747bcc-xbtfs_874f316f-a84f-4d91-97df-8300d3e42b26/prom-label-proxy/0.log" Apr 20 21:12:07.362420 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:07.362392 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-575b747bcc-xbtfs_874f316f-a84f-4d91-97df-8300d3e42b26/kube-rbac-proxy-rules/0.log" Apr 20 21:12:07.388976 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:07.388928 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-575b747bcc-xbtfs_874f316f-a84f-4d91-97df-8300d3e42b26/kube-rbac-proxy-metrics/0.log" Apr 20 21:12:08.586156 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:08.586106 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-85nmr_10ad3736-4c80-4544-ad33-36bdd65c0779/networking-console-plugin/0.log" Apr 20 21:12:09.476125 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.476090 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-84bf55c695-hszhv_ecf08bb4-ea03-45a3-acdb-d8890ef6b9b5/console/0.log" Apr 20 21:12:09.726618 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.726541 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm"] Apr 20 21:12:09.727045 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.726997 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" containerName="gather" Apr 20 21:12:09.727045 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.727015 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" containerName="gather" Apr 20 21:12:09.727045 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.727028 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" containerName="copy" Apr 20 21:12:09.727045 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.727036 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" containerName="copy" Apr 20 21:12:09.727277 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.727155 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" containerName="gather" Apr 20 21:12:09.727277 
ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.727172 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2a5aac9-c17f-42b9-ba83-e65ae7320cd5" containerName="copy" Apr 20 21:12:09.732232 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.732211 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:09.735136 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.735084 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qsjk5\"/\"kube-root-ca.crt\"" Apr 20 21:12:09.735301 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.735282 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qsjk5\"/\"openshift-service-ca.crt\"" Apr 20 21:12:09.736317 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.736303 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qsjk5\"/\"default-dockercfg-nn688\"" Apr 20 21:12:09.740664 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.740640 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm"] Apr 20 21:12:09.807263 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.807237 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0-lib-modules\") pod \"perf-node-gather-daemonset-jgftm\" (UID: \"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:09.807263 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.807265 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0-podres\") pod \"perf-node-gather-daemonset-jgftm\" (UID: \"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:09.807443 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.807284 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z28k\" (UniqueName: \"kubernetes.io/projected/08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0-kube-api-access-4z28k\") pod \"perf-node-gather-daemonset-jgftm\" (UID: \"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:09.807443 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.807305 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0-sys\") pod \"perf-node-gather-daemonset-jgftm\" (UID: \"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:09.807443 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.807352 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0-proc\") pod \"perf-node-gather-daemonset-jgftm\" (UID: \"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:09.908611 ip-10-0-143-23 kubenswrapper[2576]: I0420 
21:12:09.908580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0-lib-modules\") pod \"perf-node-gather-daemonset-jgftm\" (UID: \"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:09.908759 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.908613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0-podres\") pod \"perf-node-gather-daemonset-jgftm\" (UID: \"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:09.908759 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.908644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4z28k\" (UniqueName: \"kubernetes.io/projected/08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0-kube-api-access-4z28k\") pod \"perf-node-gather-daemonset-jgftm\" (UID: \"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:09.908759 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.908675 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0-sys\") pod \"perf-node-gather-daemonset-jgftm\" (UID: \"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:09.908759 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.908715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0-proc\") pod \"perf-node-gather-daemonset-jgftm\" (UID: \"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:09.908759 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.908740 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0-podres\") pod \"perf-node-gather-daemonset-jgftm\" (UID: \"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:09.908759 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.908744 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0-lib-modules\") pod \"perf-node-gather-daemonset-jgftm\" (UID: \"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:09.908989 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.908778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0-sys\") pod \"perf-node-gather-daemonset-jgftm\" (UID: \"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:09.908989 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.908796 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0-proc\") pod 
\"perf-node-gather-daemonset-jgftm\" (UID: \"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:09.918029 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:09.918002 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z28k\" (UniqueName: \"kubernetes.io/projected/08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0-kube-api-access-4z28k\") pod \"perf-node-gather-daemonset-jgftm\" (UID: \"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0\") " pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:10.043742 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:10.043721 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:10.168641 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:10.168615 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm"] Apr 20 21:12:10.170962 ip-10-0-143-23 kubenswrapper[2576]: W0420 21:12:10.170934 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod08b4ad4f_e44d_4b0e_9f90_4c7888cf58a0.slice/crio-0c13a562c3197ea7008f0ce721eca312ae452fcd89c473f35f696b26c94e82f2 WatchSource:0}: Error finding container 0c13a562c3197ea7008f0ce721eca312ae452fcd89c473f35f696b26c94e82f2: Status 404 returned error can't find the container with id 0c13a562c3197ea7008f0ce721eca312ae452fcd89c473f35f696b26c94e82f2 Apr 20 21:12:10.494565 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:10.494492 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" event={"ID":"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0","Type":"ContainerStarted","Data":"eb5e56c90e4274ee80c7e66a435c68f3ae3e26c2b54e3ad3cecd2c83fbc4819d"} Apr 20 21:12:10.494565 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:10.494533 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" event={"ID":"08b4ad4f-e44d-4b0e-9f90-4c7888cf58a0","Type":"ContainerStarted","Data":"0c13a562c3197ea7008f0ce721eca312ae452fcd89c473f35f696b26c94e82f2"} Apr 20 21:12:10.494833 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:10.494675 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:10.516172 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:10.516100 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" podStartSLOduration=1.516088276 podStartE2EDuration="1.516088276s" podCreationTimestamp="2026-04-20 21:12:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 21:12:10.514726872 +0000 UTC m=+4005.227408868" watchObservedRunningTime="2026-04-20 21:12:10.516088276 +0000 UTC m=+4005.228770273" Apr 20 21:12:10.780857 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:10.780784 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x2zsb_88bdb2a3-8afb-427f-bc63-e22166098be9/dns/0.log" Apr 20 21:12:10.808062 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:10.808041 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x2zsb_88bdb2a3-8afb-427f-bc63-e22166098be9/kube-rbac-proxy/0.log" Apr 
20 21:12:10.833031 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:10.833003 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kzk2j_9cdd6b10-c252-4482-9ba2-40fdae9ef435/dns-node-resolver/0.log" Apr 20 21:12:11.358737 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:11.358707 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-56wkj_8c5b0ba6-52e0-4a02-bb42-a4b2b377f250/node-ca/0.log" Apr 20 21:12:12.426811 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:12.426780 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-s4dgk_3dfe201c-4c10-4a3b-98f1-39ca2bed620f/serve-healthcheck-canary/0.log" Apr 20 21:12:12.966281 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:12.966251 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hmlx5_25f0a3e8-817e-4737-ac33-9ebc57ac1001/kube-rbac-proxy/0.log" Apr 20 21:12:12.988786 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:12.988757 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hmlx5_25f0a3e8-817e-4737-ac33-9ebc57ac1001/exporter/0.log" Apr 20 21:12:13.009238 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:13.009216 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hmlx5_25f0a3e8-817e-4737-ac33-9ebc57ac1001/extractor/0.log" Apr 20 21:12:15.012715 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:15.012688 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-dr5bg_c0890812-2ed0-426a-b98f-17b91b07c72b/server/0.log" Apr 20 21:12:15.230964 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:15.230934 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-vm6h5_e8d16292-03dc-4ace-a410-d17256e8596f/manager/0.log" Apr 20 21:12:15.251455 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:15.251437 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-6gm2r_5f256a79-a998-41f8-be60-f43eed0bbc4d/s3-init/0.log" Apr 20 21:12:15.273560 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:15.273500 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-njtxf_a33aa541-18c2-4900-943e-7be3c40c5e1b/s3-tls-init-custom/0.log" Apr 20 21:12:15.320474 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:15.320445 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-rlh6d_351f1e3a-984d-49b3-96b5-b629f861d4d2/seaweedfs/0.log" Apr 20 21:12:15.347299 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:15.347263 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-zgmn9_14e6e679-f921-4d34-a883-989f705c6e43/seaweedfs-tls-custom/0.log" Apr 20 21:12:15.370095 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:15.370070 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-p4qzl_80102916-20d8-463f-9795-f3e3974b86f4/seaweedfs-tls-serving/0.log" Apr 20 21:12:16.507729 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:16.507704 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qsjk5/perf-node-gather-daemonset-jgftm" Apr 20 21:12:19.205067 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:19.205036 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zszld_88f6fa69-2bdb-4e93-941b-4e14d4be2a2f/migrator/0.log" Apr 20 21:12:19.227244 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:19.227223 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zszld_88f6fa69-2bdb-4e93-941b-4e14d4be2a2f/graceful-termination/0.log" Apr 20 21:12:19.617946 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:19.617922 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-jgpb4_bc006b49-b340-4b63-9917-d78702246b64/kube-storage-version-migrator-operator/1.log" Apr 20 21:12:19.620502 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:19.620481 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-jgpb4_bc006b49-b340-4b63-9917-d78702246b64/kube-storage-version-migrator-operator/0.log" Apr 20 21:12:20.457772 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:20.457730 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-44qfk_d3bb9caf-0865-431c-89be-a2a222a45633/kube-multus/0.log" Apr 20 21:12:20.483295 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:20.483273 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-68wc5_45f05212-3e62-445f-af62-b586721d3417/kube-multus-additional-cni-plugins/0.log" Apr 20 21:12:20.506137 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:20.506101 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-68wc5_45f05212-3e62-445f-af62-b586721d3417/egress-router-binary-copy/0.log" Apr 20 21:12:20.529097 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:20.529074 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-68wc5_45f05212-3e62-445f-af62-b586721d3417/cni-plugins/0.log" Apr 20 21:12:20.550257 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:20.550241 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-68wc5_45f05212-3e62-445f-af62-b586721d3417/bond-cni-plugin/0.log" Apr 20 21:12:20.570456 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:20.570438 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-68wc5_45f05212-3e62-445f-af62-b586721d3417/routeoverride-cni/0.log" Apr 20 21:12:20.590713 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:20.590693 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-68wc5_45f05212-3e62-445f-af62-b586721d3417/whereabouts-cni-bincopy/0.log" Apr 20 21:12:20.610641 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:20.610623 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-68wc5_45f05212-3e62-445f-af62-b586721d3417/whereabouts-cni/0.log" Apr 20 21:12:21.024562 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:21.024484 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-2n29c_4c96dea8-a54f-4ca2-a3fb-757208554fe3/network-metrics-daemon/0.log" Apr 20 21:12:21.049431 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:21.049407 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-2n29c_4c96dea8-a54f-4ca2-a3fb-757208554fe3/kube-rbac-proxy/0.log" Apr 20 21:12:22.404382 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:22.404354 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-controller/0.log" Apr 20 21:12:22.420569 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:22.420546 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/0.log" Apr 20 21:12:22.437171 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:22.437151 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovn-acl-logging/1.log" Apr 20 21:12:22.458725 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:22.458703 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/kube-rbac-proxy-node/0.log" Apr 20 21:12:22.482356 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:22.482334 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 21:12:22.502298 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:22.502280 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/northd/0.log" Apr 20 21:12:22.521741 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:22.521721 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/nbdb/0.log" Apr 20 21:12:22.543711 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:22.543688 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/sbdb/0.log" Apr 20 21:12:22.709458 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:22.709397 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z55qt_f78ac3d9-bcf1-43dd-aac7-1678831ee3ba/ovnkube-controller/0.log" Apr 20 21:12:23.675007 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:23.674979 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-tsc9v_cbd39223-9397-40f2-b3bf-fdd7ce3ab9ac/check-endpoints/0.log" Apr 20 21:12:23.721763 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:23.721740 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-p8x9m_26d6564f-741b-481d-a1c3-a42559981c32/network-check-target-container/0.log" Apr 20 21:12:24.635480 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:24.635451 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-nt6rc_45668015-eebd-4a0f-adbc-9ebb7f100cbd/iptables-alerter/0.log" Apr 20 21:12:25.302081 ip-10-0-143-23 kubenswrapper[2576]: I0420 21:12:25.302054 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-lzm5k_7f59f5a3-2821-4412-9688-73d69c9bbb4c/tuned/0.log"